Mar 10 15:48:21 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 10 15:48:21 crc restorecon[4748]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 15:48:21 crc restorecon[4748]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 15:48:21 crc restorecon[4748]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc 
restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:21 crc restorecon[4748]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc 
restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 
15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:21 crc restorecon[4748]: 
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:48:21 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 
15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:48:22 crc 
restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc 
restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc 
restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc 
restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc 
restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc 
restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc 
restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:48:22 crc restorecon[4748]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:48:22 crc restorecon[4748]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 10 15:48:23 crc kubenswrapper[4749]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 15:48:23 crc kubenswrapper[4749]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 10 15:48:23 crc kubenswrapper[4749]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 15:48:23 crc kubenswrapper[4749]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 10 15:48:23 crc kubenswrapper[4749]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 10 15:48:23 crc kubenswrapper[4749]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.385707 4749 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391227 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391249 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391254 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391260 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391264 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391270 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391275 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391281 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391287 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391292 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391297 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391301 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391305 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391309 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391313 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391317 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391326 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391329 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391335 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391340 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391344 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391347 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391352 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391356 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391360 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391364 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391367 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391373 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391377 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391381 4749 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391384 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391402 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391406 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391411 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391414 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391420 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391425 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391433 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391437 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391442 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391446 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391451 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391455 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391459 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391463 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391467 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391472 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391475 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391479 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391483 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391487 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391491 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391495 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391498 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391502 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391505 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391509 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391512 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391516 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391531 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391556 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391561 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391566 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391570 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391573 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391577 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391581 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391585 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391589 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391592 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.391596 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.392999 4749 flags.go:64] FLAG: --address="0.0.0.0"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393016 4749 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393026 4749 flags.go:64] FLAG: --anonymous-auth="true"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393033 4749 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393039 4749 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393045 4749 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393051 4749 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393057 4749 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393061 4749 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393068 4749 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393074 4749 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393078 4749 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393082 4749 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393086 4749 flags.go:64] FLAG: --cgroup-root=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393091 4749 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393095 4749 flags.go:64] FLAG: --client-ca-file=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393099 4749 flags.go:64] FLAG: --cloud-config=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393103 4749 flags.go:64] FLAG: --cloud-provider=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393107 4749 flags.go:64] FLAG: --cluster-dns="[]"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393112 4749 flags.go:64] FLAG: --cluster-domain=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393116 4749 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393121 4749 flags.go:64] FLAG: --config-dir=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393125 4749 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393138 4749 flags.go:64] FLAG: --container-log-max-files="5"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393145 4749 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393150 4749 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393155 4749 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393160 4749 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393165 4749 flags.go:64] FLAG: --contention-profiling="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393170 4749 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393175 4749 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393179 4749 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393184 4749 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393191 4749 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393195 4749 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393199 4749 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393204 4749 flags.go:64] FLAG: --enable-load-reader="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393208 4749 flags.go:64] FLAG: --enable-server="true"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393212 4749 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393218 4749 flags.go:64] FLAG: --event-burst="100"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393222 4749 flags.go:64] FLAG: --event-qps="50"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393227 4749 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393231 4749 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393236 4749 flags.go:64] FLAG: --eviction-hard=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393242 4749 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393246 4749 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393250 4749 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393255 4749 flags.go:64] FLAG: --eviction-soft=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393259 4749 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393263 4749 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393268 4749 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393272 4749 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393276 4749 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393280 4749 flags.go:64] FLAG: --fail-swap-on="true"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393284 4749 flags.go:64] FLAG: --feature-gates=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393289 4749 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393293 4749 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393298 4749 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393302 4749 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393306 4749 flags.go:64] FLAG: --healthz-port="10248"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393310 4749 flags.go:64] FLAG: --help="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393314 4749 flags.go:64] FLAG: --hostname-override=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393318 4749 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393323 4749 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393327 4749 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393331 4749 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393335 4749 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393339 4749 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393344 4749 flags.go:64] FLAG: --image-service-endpoint=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393349 4749 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393362 4749 flags.go:64] FLAG: --kube-api-burst="100"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393412 4749 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393419 4749 flags.go:64] FLAG: --kube-api-qps="50"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393426 4749 flags.go:64] FLAG: --kube-reserved=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393431 4749 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393438 4749 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393444 4749 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393449 4749 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393453 4749 flags.go:64] FLAG: --lock-file=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393457 4749 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393462 4749 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393466 4749 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393473 4749 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393477 4749 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393481 4749 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393487 4749 flags.go:64] FLAG: --logging-format="text"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393492 4749 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393497 4749 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393503 4749 flags.go:64] FLAG: --manifest-url=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393509 4749 flags.go:64] FLAG: --manifest-url-header=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393516 4749 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393521 4749 flags.go:64] FLAG: --max-open-files="1000000"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393527 4749 flags.go:64] FLAG: --max-pods="110"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393533 4749 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393538 4749 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393543 4749 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393548 4749 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393553 4749 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393557 4749 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393561 4749 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393572 4749 flags.go:64] FLAG: --node-status-max-images="50"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393577 4749 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393581 4749 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393585 4749 flags.go:64] FLAG: --pod-cidr=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393590 4749 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393597 4749 flags.go:64] FLAG: --pod-manifest-path=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393602 4749 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393606 4749 flags.go:64] FLAG: --pods-per-core="0"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393611 4749 flags.go:64] FLAG: --port="10250"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393615 4749 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393619 4749 flags.go:64] FLAG: --provider-id=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393623 4749 flags.go:64] FLAG: --qos-reserved=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393628 4749 flags.go:64] FLAG: --read-only-port="10255"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393632 4749 flags.go:64] FLAG: --register-node="true"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393637 4749 flags.go:64] FLAG: --register-schedulable="true"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393642 4749 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393651 4749 flags.go:64] FLAG: --registry-burst="10"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393656 4749 flags.go:64] FLAG: --registry-qps="5"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393661 4749 flags.go:64] FLAG: --reserved-cpus=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393667 4749 flags.go:64] FLAG: --reserved-memory=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393674 4749 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393679 4749 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393683 4749 flags.go:64] FLAG: --rotate-certificates="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393688 4749 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393692 4749 flags.go:64] FLAG: --runonce="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393697 4749 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393703 4749 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393714 4749 flags.go:64] FLAG: --seccomp-default="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393720 4749 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393726 4749 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393731 4749 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393737 4749 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393743 4749 flags.go:64] FLAG: --storage-driver-password="root"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393749 4749 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393754 4749 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393759 4749 flags.go:64] FLAG: --storage-driver-user="root"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393764 4749 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393770 4749 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393780 4749 flags.go:64] FLAG: --system-cgroups=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393785 4749 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393794 4749 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393799 4749 flags.go:64] FLAG: --tls-cert-file=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393802 4749 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393809 4749 flags.go:64] FLAG: --tls-min-version=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393813 4749 flags.go:64] FLAG: --tls-private-key-file=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393817 4749 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393821 4749 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393825 4749 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393830 4749 flags.go:64] FLAG: --v="2"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393837 4749 flags.go:64] FLAG: --version="false"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393843 4749 flags.go:64] FLAG: --vmodule=""
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393848 4749 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.393853 4749 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.393953 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.393959 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.393963 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.393967 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.393971 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.393975 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.393979 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.393984 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.393988 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.393992 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.393997 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394002 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394007 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394013 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394018 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394022 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394026 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394033 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394037 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394042 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394047 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394052 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394061 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394068 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394074 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394078 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394083 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394087 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394092 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394096 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394100 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394104 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394108 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394113 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394118 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394122 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394126 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394130 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394134 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394137 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394141 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394144 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394148 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394151 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394154 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394158 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394161 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394165 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394169 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394175 4749 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394178 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394181 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394185 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394189 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394192 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394196 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394199 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394203 4749 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394206 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394210 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394214 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394217 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394222 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394225 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394229 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394232 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394235 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394239 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394242 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394246 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.394249 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.394256 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.405000 4749 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.405053 4749 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405121 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405131 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405135 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405139 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405143 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405148 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405153 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405158 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405165 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405169 4749 feature_gate.go:330] unrecognized feature gate: Example Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405173 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405178 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405181 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405185 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405188 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405192 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405196 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405200 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405204 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405209 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405214 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405219 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405223 
4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405228 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405233 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405237 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405242 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405247 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405252 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405255 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405259 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405263 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405266 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405270 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405275 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405278 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405282 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 
15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405285 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405291 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405294 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405298 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405302 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405305 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405309 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405313 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405316 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405319 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405323 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405327 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405330 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405334 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405337 4749 feature_gate.go:330] unrecognized 
feature gate: BootcNodeManagement Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405343 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405347 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405351 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405355 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405358 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405363 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405367 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405375 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405378 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405382 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405401 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405406 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405410 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405414 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405418 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405421 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405425 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405429 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405434 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.405442 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405829 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405838 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 
15:48:23.405843 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405847 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405850 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405854 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405857 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405862 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405866 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405870 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405873 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405877 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405882 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405887 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405892 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405897 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405901 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405905 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405909 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405914 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405918 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405922 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405926 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405929 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405933 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405936 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405940 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 15:48:23 crc 
kubenswrapper[4749]: W0310 15:48:23.405943 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405947 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405951 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405955 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405959 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405962 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405966 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405970 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405974 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405977 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405981 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405984 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405988 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405991 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.405996 4749 feature_gate.go:330] unrecognized feature 
gate: HardwareSpeed Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406000 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406003 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406006 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406010 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406028 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406034 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406040 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406046 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406051 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406057 4749 feature_gate.go:330] unrecognized feature gate: Example Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406062 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406066 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406071 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406075 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406080 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406084 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406089 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406092 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406097 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406101 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406105 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406109 4749 
feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406113 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406117 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406121 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406126 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406130 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406134 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.406140 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.406148 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.406340 4749 server.go:940] "Client rotation is on, will bootstrap in background" Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.409833 4749 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 10 15:48:23 
crc kubenswrapper[4749]: I0310 15:48:23.414248 4749 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.414423 4749 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.416360 4749 server.go:997] "Starting client certificate rotation" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.416425 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.416626 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.443085 4749 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.448165 4749 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.448718 4749 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.467121 4749 log.go:25] "Validated CRI v1 runtime API" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.501266 4749 log.go:25] "Validated CRI v1 image API" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.503132 4749 server.go:1437] "Using cgroup driver setting received from the CRI runtime" 
cgroupDriver="systemd" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.507344 4749 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-10-15-43-50-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.507381 4749 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.522261 4749 manager.go:217] Machine: {Timestamp:2026-03-10 15:48:23.520472367 +0000 UTC m=+0.642338074 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:daf2981f-1789-4491-b9fa-78a944145505 BootID:baf5cb54-c273-4495-b7cb-c1fd4f825d5e Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs 
Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:06:08:5e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:06:08:5e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:dc:91:54 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a5:9d:0d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:21:e1:56 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ca:25:c3 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:de:b2:b0 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:aa:91:15:78:d6:c5 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ee:78:a7:67:9d:63 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] 
CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.522668 4749 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.522945 4749 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.524491 4749 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.525208 4749 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.525254 4749 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.525503 4749 topology_manager.go:138] "Creating topology manager with none policy" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.525512 4749 container_manager_linux.go:303] "Creating device plugin manager" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.525980 4749 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.526014 4749 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.526301 4749 state_mem.go:36] "Initialized new in-memory state store" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.526434 4749 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.530831 4749 kubelet.go:418] "Attempting to sync node with API server" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.530861 4749 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.530924 4749 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.530945 4749 kubelet.go:324] "Adding apiserver pod source" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.530967 4749 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.537208 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.537334 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.538091 4749 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.538295 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.539593 4749 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.540713 4749 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.542148 4749 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.544126 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.544155 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.544163 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.544169 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.544182 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.544190 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.544198 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.544210 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.544220 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.544229 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.544242 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.544249 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.544870 4749 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.545403 4749 server.go:1280] "Started kubelet" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.545570 4749 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.545664 4749 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.546531 4749 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.546657 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.547643 4749 server.go:460] "Adding debug handlers to kubelet server" Mar 10 15:48:23 crc systemd[1]: Started Kubernetes Kubelet. 
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.548276 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.548314 4749 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.548755 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.548919 4749 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.548959 4749 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.549015 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="200ms" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.549057 4749 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.549596 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.549690 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.551483 4749 factory.go:55] 
Registering systemd factory Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.551516 4749 factory.go:221] Registration of the systemd container factory successfully Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.552004 4749 factory.go:153] Registering CRI-O factory Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.552062 4749 factory.go:221] Registration of the crio container factory successfully Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.552151 4749 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.552182 4749 factory.go:103] Registering Raw factory Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.552210 4749 manager.go:1196] Started watching for new ooms in manager Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.553057 4749 manager.go:319] Starting recovery of all containers Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.557601 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.199:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b85896a9176a8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.545353896 +0000 UTC m=+0.667219583,LastTimestamp:2026-03-10 15:48:23.545353896 +0000 UTC m=+0.667219583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 
15:48:23.558817 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559163 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559269 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559310 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559332 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559363 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559399 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559419 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559452 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559475 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559504 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559546 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559576 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559598 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559624 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559646 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559678 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559702 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559720 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559746 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559764 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559790 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559814 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559833 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559856 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.559873 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.560629 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.560730 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.560751 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.560782 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.560892 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561464 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561502 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561537 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561553 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561570 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561701 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561761 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561787 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561830 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561843 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561860 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561874 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561900 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561926 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561945 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561966 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.561982 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562005 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 
10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562022 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562054 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562077 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562124 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562143 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562166 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562186 
4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562200 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562216 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562228 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562240 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562253 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562264 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562279 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.562293 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.566939 4749 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567013 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567039 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567057 4749 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567074 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567090 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567105 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567119 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567136 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567151 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567164 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567178 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567190 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567202 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567215 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567227 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567239 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567250 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567262 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567275 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567287 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567297 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567310 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567328 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567339 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567350 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567361 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567376 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567408 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567433 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567456 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567468 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567480 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567490 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567501 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567513 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567524 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567539 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567556 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567578 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567596 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567627 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567650 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567665 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567681 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567695 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567707 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567720 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567732 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567743 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567755 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567766 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" 
seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567782 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567798 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567813 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567827 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567842 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567855 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 10 15:48:23 crc 
kubenswrapper[4749]: I0310 15:48:23.567870 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.567881 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568020 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568038 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568053 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568068 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568083 4749 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568099 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568114 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568127 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568139 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568152 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568165 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568176 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568188 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568200 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568211 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568222 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568236 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568247 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568265 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568279 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568293 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568335 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568347 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" 
seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568358 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568369 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568380 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568411 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568422 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568433 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568444 4749 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568457 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568468 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568480 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568492 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568503 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568514 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568525 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568535 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568546 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568557 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568570 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568583 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568594 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568604 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568615 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568627 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568637 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568668 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568682 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568692 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568703 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568712 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568723 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568734 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" 
seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568746 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568763 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568780 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568796 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568817 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568831 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 
15:48:23.568844 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568855 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568864 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568875 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568888 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568900 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568911 4749 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568924 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568935 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568945 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568955 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568965 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568976 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568986 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.568997 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.569008 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.569020 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.569032 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.569043 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.569055 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.569066 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.569077 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.569088 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.569100 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.569110 4749 reconstruct.go:97] "Volume reconstruction finished" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.569118 4749 reconciler.go:26] "Reconciler: start to sync state" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 
15:48:23.582003 4749 manager.go:324] Recovery completed Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.592080 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.595758 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.595822 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.595836 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.599557 4749 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.599595 4749 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.599635 4749 state_mem.go:36] "Initialized new in-memory state store" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.602633 4749 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.605040 4749 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.605118 4749 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.605430 4749 kubelet.go:2335] "Starting kubelet main sync loop" Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.605508 4749 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 10 15:48:23 crc kubenswrapper[4749]: W0310 15:48:23.606418 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.606473 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.612398 4749 policy_none.go:49] "None policy: Start" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.613428 4749 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.613713 4749 state_mem.go:35] "Initializing new in-memory state store" Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.648951 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.653688 4749 manager.go:334] "Starting Device Plugin manager" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.653865 4749 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.653890 4749 server.go:79] "Starting device plugin registration server" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.654437 4749 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.654457 4749 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.654697 4749 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.654777 4749 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.654795 4749 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.664774 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.706094 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.706267 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.707875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.707919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.707932 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.708087 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.708603 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.708786 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.709367 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.709436 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.709449 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.709716 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.709924 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.710016 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.710468 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.710588 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.710635 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.711840 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.711873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.711887 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.712202 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.712286 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.712394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.712516 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:23 crc 
kubenswrapper[4749]: I0310 15:48:23.713170 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.713224 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.713632 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.713665 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.713680 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.713852 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.714015 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.714086 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.715124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.715157 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.715169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.715129 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.715282 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.715311 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.715565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.715583 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.715594 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.715744 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.715786 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.721659 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.721715 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.721727 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.750892 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="400ms" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.754929 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.756488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.756526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.756535 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.756566 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.756901 4749 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.199:6443: connect: connection refused" node="crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770733 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770752 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770771 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770788 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770803 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770817 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770834 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770848 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770863 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770879 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770893 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770906 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.770920 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.871940 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.871997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872015 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872058 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872118 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872121 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872140 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872180 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872187 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872196 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872209 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872211 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872236 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872247 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872252 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872273 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872290 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872322 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872347 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872371 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872434 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872470 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872492 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.872604 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.958090 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.959590 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.959649 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.959666 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:48:23 crc kubenswrapper[4749]: I0310 15:48:23.959704 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 15:48:23 crc kubenswrapper[4749]: E0310 15:48:23.960283 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.199:6443: connect: connection refused" node="crc"
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.039841 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.057808 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.069043 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 10 15:48:24 crc kubenswrapper[4749]: W0310 15:48:24.076789 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-7eb8c630fa006ff25a5c2e18d9fd55de1729f659b05b28249c407c93e7983033 WatchSource:0}: Error finding container 7eb8c630fa006ff25a5c2e18d9fd55de1729f659b05b28249c407c93e7983033: Status 404 returned error can't find the container with id 7eb8c630fa006ff25a5c2e18d9fd55de1729f659b05b28249c407c93e7983033
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.087107 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:48:24 crc kubenswrapper[4749]: W0310 15:48:24.089718 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-aaa3439b549f564f4dbd9da52aba14be01166536f04a74f65e52b30a602673dd WatchSource:0}: Error finding container aaa3439b549f564f4dbd9da52aba14be01166536f04a74f65e52b30a602673dd: Status 404 returned error can't find the container with id aaa3439b549f564f4dbd9da52aba14be01166536f04a74f65e52b30a602673dd
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.091418 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:48:24 crc kubenswrapper[4749]: W0310 15:48:24.092010 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-8bdaed6b30e029650a364b4d91bb04cec363e3449190f8cbce8cb39ab63ab9c0 WatchSource:0}: Error finding container 8bdaed6b30e029650a364b4d91bb04cec363e3449190f8cbce8cb39ab63ab9c0: Status 404 returned error can't find the container with id 8bdaed6b30e029650a364b4d91bb04cec363e3449190f8cbce8cb39ab63ab9c0
Mar 10 15:48:24 crc kubenswrapper[4749]: W0310 15:48:24.114174 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-09288da3b83a7cf595ac9e29eb2a4182eeb9ffbd4521fc379a59726f73b7cdc0 WatchSource:0}: Error finding container 09288da3b83a7cf595ac9e29eb2a4182eeb9ffbd4521fc379a59726f73b7cdc0: Status 404 returned error can't find the container with id 09288da3b83a7cf595ac9e29eb2a4182eeb9ffbd4521fc379a59726f73b7cdc0
Mar 10 15:48:24 crc kubenswrapper[4749]: E0310 15:48:24.152824 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="800ms"
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.360782 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.362582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.362626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.362637 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.362669 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 15:48:24 crc kubenswrapper[4749]: E0310 15:48:24.363174 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.199:6443: connect: connection refused" node="crc"
Mar 10 15:48:24 crc kubenswrapper[4749]: W0310 15:48:24.481310 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused
Mar 10 15:48:24 crc kubenswrapper[4749]: E0310 15:48:24.481456 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError"
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.548104 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.610374 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"aaa3439b549f564f4dbd9da52aba14be01166536f04a74f65e52b30a602673dd"}
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.611776 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7eb8c630fa006ff25a5c2e18d9fd55de1729f659b05b28249c407c93e7983033"}
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.613681 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"09288da3b83a7cf595ac9e29eb2a4182eeb9ffbd4521fc379a59726f73b7cdc0"}
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.614902 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"44fba5a4922ba6b8bcbf657d2575e12c41855c9d6de9c5ebcfcd14111b3f35a9"}
Mar 10 15:48:24 crc kubenswrapper[4749]: I0310 15:48:24.616734 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8bdaed6b30e029650a364b4d91bb04cec363e3449190f8cbce8cb39ab63ab9c0"}
Mar 10 15:48:24 crc kubenswrapper[4749]: W0310 15:48:24.707158 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused
Mar 10 15:48:24 crc kubenswrapper[4749]: E0310 15:48:24.707238 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError"
Mar 10 15:48:24 crc kubenswrapper[4749]: W0310 15:48:24.942490 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused
Mar 10 15:48:24 crc kubenswrapper[4749]: E0310 15:48:24.942645 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError"
Mar 10 15:48:24 crc kubenswrapper[4749]: E0310 15:48:24.954278 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="1.6s"
Mar 10 15:48:25 crc kubenswrapper[4749]: W0310 15:48:25.161737 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused
Mar 10 15:48:25 crc kubenswrapper[4749]: E0310 15:48:25.161840 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.163871 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.165761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.165821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.165839 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.165881 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 15:48:25 crc kubenswrapper[4749]: E0310 15:48:25.166621 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.199:6443: connect: connection refused" node="crc"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.548496 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.617773 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 10 15:48:25 crc kubenswrapper[4749]: E0310 15:48:25.618967 4749 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.625310 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5"}
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.625358 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016"}
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.625369 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84"}
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.625381 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108"}
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.625483 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.626229 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.626256 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.626265 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.628734 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777" exitCode=0
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.628784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777"}
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.628870 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.630063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.630087 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.630097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.631665 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76" exitCode=0
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.631710 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76"}
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.631797 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.632330 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.632855 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.632874 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.632882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.633325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.633340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.633348 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.636178 4749 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b" exitCode=0
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.636248 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b"}
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.636306 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.637212 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.637234 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.637243 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.639516 4749 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7" exitCode=0
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.639538 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7"}
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.639599 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.640257 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.640279 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:48:25 crc kubenswrapper[4749]: I0310 15:48:25.640288 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:48:26 crc kubenswrapper[4749]: W0310 15:48:26.249775 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused
Mar 10 15:48:26 crc kubenswrapper[4749]: E0310 15:48:26.250144 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.548206 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused
Mar 10 15:48:26 crc kubenswrapper[4749]: E0310 15:48:26.555151 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="3.2s"
Mar 10 15:48:26 crc kubenswrapper[4749]: W0310 15:48:26.575001 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused
Mar 10 15:48:26 crc kubenswrapper[4749]: E0310 15:48:26.575129 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.650092 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297" exitCode=0
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.650148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297"}
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.650285 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.653250 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.653325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.653345 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.655419 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.655375 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b5201d42ee17cfddadb04e77decb575ee63afeab3f0f1dac0ea675763a694a71"}
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.656407 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.656453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.656466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.660722 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.660856 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f728cec110ab3738832921769ef26fb951632b05de4c3fb1f04fbccc152c8df5"}
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.660922 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e6520e37fbb486b4fe9ee982e761c8153efff9e4f9a136c0492860dd440ebbb4"}
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.660936 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a2ce914fe09be8e266078e486a824c15113758204e331d204314552889a176cd"}
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.662433 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.662506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.662525 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.664883 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.665290 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa"}
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.665358 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e"}
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.665406 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902"}
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.665425 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5"}
Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.665976 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.666020 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.666033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.767460 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.769251 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.769312 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.769327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:26 crc kubenswrapper[4749]: I0310 15:48:26.769361 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:48:26 crc kubenswrapper[4749]: E0310 15:48:26.770014 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.199:6443: connect: connection refused" node="crc" Mar 10 15:48:27 crc kubenswrapper[4749]: W0310 15:48:27.032881 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused Mar 10 15:48:27 crc kubenswrapper[4749]: E0310 15:48:27.033009 4749 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.672531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"91e3c4031d58beb6aa76bb453acbee198776daae17c8e1bd865ab2d57bd6ce4a"} Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.672662 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.673988 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.674039 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.674050 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.675471 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b" exitCode=0 Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.675569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b"} Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.675594 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 15:48:27 crc 
kubenswrapper[4749]: I0310 15:48:27.675695 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.675723 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.675722 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.677132 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.677159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.677172 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.677188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.677165 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.677189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.677281 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.677294 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:27 crc kubenswrapper[4749]: I0310 15:48:27.677294 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 15:48:28 crc kubenswrapper[4749]: I0310 15:48:28.079144 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:48:28 crc kubenswrapper[4749]: I0310 15:48:28.682708 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358"} Mar 10 15:48:28 crc kubenswrapper[4749]: I0310 15:48:28.682791 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e"} Mar 10 15:48:28 crc kubenswrapper[4749]: I0310 15:48:28.682832 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:28 crc kubenswrapper[4749]: I0310 15:48:28.682918 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:48:28 crc kubenswrapper[4749]: I0310 15:48:28.684185 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:28 crc kubenswrapper[4749]: I0310 15:48:28.684259 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:28 crc kubenswrapper[4749]: I0310 15:48:28.684275 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.694135 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d"} Mar 10 15:48:29 crc 
kubenswrapper[4749]: I0310 15:48:29.694219 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea"} Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.694237 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4"} Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.694251 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.694322 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.695090 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.695122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.695132 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.695559 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.695599 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.695614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.970180 
4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.972026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.972100 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.972114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:29 crc kubenswrapper[4749]: I0310 15:48:29.972155 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:48:30 crc kubenswrapper[4749]: I0310 15:48:30.004494 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:48:30 crc kubenswrapper[4749]: I0310 15:48:30.004773 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:30 crc kubenswrapper[4749]: I0310 15:48:30.006283 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:30 crc kubenswrapper[4749]: I0310 15:48:30.006338 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:30 crc kubenswrapper[4749]: I0310 15:48:30.006351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:30 crc kubenswrapper[4749]: I0310 15:48:30.015589 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 15:48:30 crc kubenswrapper[4749]: I0310 15:48:30.697040 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:30 crc 
kubenswrapper[4749]: I0310 15:48:30.698164 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:30 crc kubenswrapper[4749]: I0310 15:48:30.698212 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:30 crc kubenswrapper[4749]: I0310 15:48:30.698225 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:30 crc kubenswrapper[4749]: I0310 15:48:30.828218 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 10 15:48:31 crc kubenswrapper[4749]: I0310 15:48:31.700063 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:31 crc kubenswrapper[4749]: I0310 15:48:31.701296 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:31 crc kubenswrapper[4749]: I0310 15:48:31.701348 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:31 crc kubenswrapper[4749]: I0310 15:48:31.701360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:32 crc kubenswrapper[4749]: I0310 15:48:32.359684 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:48:32 crc kubenswrapper[4749]: I0310 15:48:32.359933 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:32 crc kubenswrapper[4749]: I0310 15:48:32.361130 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:32 crc kubenswrapper[4749]: I0310 15:48:32.361180 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:32 crc kubenswrapper[4749]: I0310 15:48:32.361192 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.663097 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.663285 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.664848 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.664893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.664907 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:33 crc kubenswrapper[4749]: E0310 15:48:33.665219 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.665791 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.665941 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.667451 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.667511 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.667525 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.668113 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.705958 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.706127 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.710320 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.710439 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:33 crc kubenswrapper[4749]: I0310 15:48:33.710463 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:34 crc kubenswrapper[4749]: I0310 15:48:34.177526 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:48:34 crc kubenswrapper[4749]: I0310 15:48:34.708350 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:34 crc kubenswrapper[4749]: I0310 15:48:34.709601 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:34 crc kubenswrapper[4749]: I0310 15:48:34.709647 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:34 crc 
kubenswrapper[4749]: I0310 15:48:34.709657 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:34 crc kubenswrapper[4749]: I0310 15:48:34.713601 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:48:35 crc kubenswrapper[4749]: I0310 15:48:35.711559 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:35 crc kubenswrapper[4749]: I0310 15:48:35.712759 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:35 crc kubenswrapper[4749]: I0310 15:48:35.712813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:35 crc kubenswrapper[4749]: I0310 15:48:35.712828 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:36 crc kubenswrapper[4749]: I0310 15:48:36.408219 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 10 15:48:36 crc kubenswrapper[4749]: I0310 15:48:36.408713 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:36 crc kubenswrapper[4749]: I0310 15:48:36.410554 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:36 crc kubenswrapper[4749]: I0310 15:48:36.410592 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:36 crc kubenswrapper[4749]: I0310 15:48:36.410604 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:36 crc kubenswrapper[4749]: I0310 15:48:36.714450 4749 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 10 15:48:36 crc kubenswrapper[4749]: I0310 15:48:36.715519 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:36 crc kubenswrapper[4749]: I0310 15:48:36.715603 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:36 crc kubenswrapper[4749]: I0310 15:48:36.715614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:37 crc kubenswrapper[4749]: I0310 15:48:37.177978 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:48:37 crc kubenswrapper[4749]: I0310 15:48:37.178073 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 15:48:37 crc kubenswrapper[4749]: I0310 15:48:37.358024 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42040->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 10 15:48:37 crc kubenswrapper[4749]: I0310 15:48:37.358103 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42040->192.168.126.11:17697: read: connection reset by peer" Mar 10 15:48:37 crc kubenswrapper[4749]: I0310 15:48:37.549177 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 10 15:48:37 crc kubenswrapper[4749]: I0310 15:48:37.718243 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 15:48:37 crc kubenswrapper[4749]: I0310 15:48:37.720628 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="91e3c4031d58beb6aa76bb453acbee198776daae17c8e1bd865ab2d57bd6ce4a" exitCode=255 Mar 10 15:48:37 crc kubenswrapper[4749]: I0310 15:48:37.720683 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"91e3c4031d58beb6aa76bb453acbee198776daae17c8e1bd865ab2d57bd6ce4a"} Mar 10 15:48:37 crc kubenswrapper[4749]: I0310 15:48:37.720854 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:37 crc kubenswrapper[4749]: I0310 15:48:37.721660 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:37 crc kubenswrapper[4749]: I0310 15:48:37.721693 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:37 crc kubenswrapper[4749]: I0310 15:48:37.721705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 15:48:37 crc kubenswrapper[4749]: I0310 15:48:37.722244 4749 scope.go:117] "RemoveContainer" containerID="91e3c4031d58beb6aa76bb453acbee198776daae17c8e1bd865ab2d57bd6ce4a" Mar 10 15:48:38 crc kubenswrapper[4749]: W0310 15:48:38.206009 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 10 15:48:38 crc kubenswrapper[4749]: I0310 15:48:38.206148 4749 trace.go:236] Trace[1407262366]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 15:48:28.204) (total time: 10001ms): Mar 10 15:48:38 crc kubenswrapper[4749]: Trace[1407262366]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:48:38.205) Mar 10 15:48:38 crc kubenswrapper[4749]: Trace[1407262366]: [10.001948633s] [10.001948633s] END Mar 10 15:48:38 crc kubenswrapper[4749]: E0310 15:48:38.206182 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 15:48:38 crc kubenswrapper[4749]: E0310 15:48:38.434512 4749 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" 
Mar 10 15:48:38 crc kubenswrapper[4749]: E0310 15:48:38.434543 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:38Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 15:48:38 crc kubenswrapper[4749]: W0310 15:48:38.440352 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:38Z is after 2026-02-23T05:33:13Z Mar 10 15:48:38 crc kubenswrapper[4749]: E0310 15:48:38.440471 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:48:38 crc kubenswrapper[4749]: E0310 15:48:38.444118 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:38Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 10 15:48:38 crc kubenswrapper[4749]: W0310 15:48:38.444423 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:38Z is after 2026-02-23T05:33:13Z Mar 10 15:48:38 crc kubenswrapper[4749]: E0310 15:48:38.444540 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:48:38 crc kubenswrapper[4749]: E0310 15:48:38.445014 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:38Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b85896a9176a8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.545353896 +0000 UTC m=+0.667219583,LastTimestamp:2026-03-10 15:48:23.545353896 +0000 UTC m=+0.667219583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:38 crc kubenswrapper[4749]: I0310 15:48:38.446293 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get 
path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 15:48:38 crc kubenswrapper[4749]: I0310 15:48:38.446396 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 15:48:38 crc kubenswrapper[4749]: W0310 15:48:38.446550 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:38Z is after 2026-02-23T05:33:13Z Mar 10 15:48:38 crc kubenswrapper[4749]: E0310 15:48:38.446618 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:48:38 crc kubenswrapper[4749]: I0310 15:48:38.450808 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 15:48:38 crc kubenswrapper[4749]: I0310 15:48:38.450883 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 15:48:38 crc kubenswrapper[4749]: I0310 15:48:38.553545 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:38Z is after 2026-02-23T05:33:13Z Mar 10 15:48:38 crc kubenswrapper[4749]: I0310 15:48:38.726498 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 15:48:38 crc kubenswrapper[4749]: I0310 15:48:38.728436 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e43a5e58920235e9c6d87a086d1d1503eae14e1037c6cd17682b0106801be4d1"} Mar 10 15:48:38 crc kubenswrapper[4749]: I0310 15:48:38.728615 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:38 crc kubenswrapper[4749]: I0310 15:48:38.729516 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:38 crc kubenswrapper[4749]: I0310 15:48:38.729565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:38 crc kubenswrapper[4749]: I0310 15:48:38.729579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:39 crc kubenswrapper[4749]: I0310 15:48:39.550785 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-10T15:48:39Z is after 2026-02-23T05:33:13Z Mar 10 15:48:39 crc kubenswrapper[4749]: I0310 15:48:39.736401 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 15:48:39 crc kubenswrapper[4749]: I0310 15:48:39.736830 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 15:48:39 crc kubenswrapper[4749]: I0310 15:48:39.739061 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e43a5e58920235e9c6d87a086d1d1503eae14e1037c6cd17682b0106801be4d1" exitCode=255 Mar 10 15:48:39 crc kubenswrapper[4749]: I0310 15:48:39.739138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e43a5e58920235e9c6d87a086d1d1503eae14e1037c6cd17682b0106801be4d1"} Mar 10 15:48:39 crc kubenswrapper[4749]: I0310 15:48:39.739239 4749 scope.go:117] "RemoveContainer" containerID="91e3c4031d58beb6aa76bb453acbee198776daae17c8e1bd865ab2d57bd6ce4a" Mar 10 15:48:39 crc kubenswrapper[4749]: I0310 15:48:39.739391 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:39 crc kubenswrapper[4749]: I0310 15:48:39.740610 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:39 crc kubenswrapper[4749]: I0310 15:48:39.740681 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:39 crc kubenswrapper[4749]: I0310 15:48:39.740698 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 15:48:39 crc kubenswrapper[4749]: I0310 15:48:39.741479 4749 scope.go:117] "RemoveContainer" containerID="e43a5e58920235e9c6d87a086d1d1503eae14e1037c6cd17682b0106801be4d1" Mar 10 15:48:39 crc kubenswrapper[4749]: E0310 15:48:39.741697 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:48:40 crc kubenswrapper[4749]: I0310 15:48:40.551004 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:40Z is after 2026-02-23T05:33:13Z Mar 10 15:48:40 crc kubenswrapper[4749]: I0310 15:48:40.742712 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 15:48:41 crc kubenswrapper[4749]: I0310 15:48:41.062713 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:48:41 crc kubenswrapper[4749]: I0310 15:48:41.062987 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:41 crc kubenswrapper[4749]: I0310 15:48:41.064949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:41 crc kubenswrapper[4749]: I0310 15:48:41.064994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 15:48:41 crc kubenswrapper[4749]: I0310 15:48:41.065012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:41 crc kubenswrapper[4749]: I0310 15:48:41.065854 4749 scope.go:117] "RemoveContainer" containerID="e43a5e58920235e9c6d87a086d1d1503eae14e1037c6cd17682b0106801be4d1" Mar 10 15:48:41 crc kubenswrapper[4749]: E0310 15:48:41.066132 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:48:41 crc kubenswrapper[4749]: I0310 15:48:41.550888 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:41Z is after 2026-02-23T05:33:13Z Mar 10 15:48:41 crc kubenswrapper[4749]: W0310 15:48:41.586813 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:41Z is after 2026-02-23T05:33:13Z Mar 10 15:48:41 crc kubenswrapper[4749]: E0310 15:48:41.586911 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:48:42 crc kubenswrapper[4749]: I0310 15:48:42.364574 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:48:42 crc kubenswrapper[4749]: I0310 15:48:42.365459 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:42 crc kubenswrapper[4749]: I0310 15:48:42.367161 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:42 crc kubenswrapper[4749]: I0310 15:48:42.367254 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:42 crc kubenswrapper[4749]: I0310 15:48:42.367292 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:42 crc kubenswrapper[4749]: I0310 15:48:42.368162 4749 scope.go:117] "RemoveContainer" containerID="e43a5e58920235e9c6d87a086d1d1503eae14e1037c6cd17682b0106801be4d1" Mar 10 15:48:42 crc kubenswrapper[4749]: E0310 15:48:42.368436 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:48:42 crc kubenswrapper[4749]: I0310 15:48:42.369882 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:48:42 crc kubenswrapper[4749]: I0310 15:48:42.552535 4749 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:42Z is after 2026-02-23T05:33:13Z Mar 10 15:48:42 crc kubenswrapper[4749]: I0310 15:48:42.750816 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:42 crc kubenswrapper[4749]: I0310 15:48:42.752022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:42 crc kubenswrapper[4749]: I0310 15:48:42.752072 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:42 crc kubenswrapper[4749]: I0310 15:48:42.752083 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:42 crc kubenswrapper[4749]: I0310 15:48:42.752691 4749 scope.go:117] "RemoveContainer" containerID="e43a5e58920235e9c6d87a086d1d1503eae14e1037c6cd17682b0106801be4d1" Mar 10 15:48:42 crc kubenswrapper[4749]: E0310 15:48:42.752920 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:48:43 crc kubenswrapper[4749]: I0310 15:48:43.551676 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:48:43Z is after 2026-02-23T05:33:13Z Mar 10 
15:48:43 crc kubenswrapper[4749]: E0310 15:48:43.665407 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:48:44 crc kubenswrapper[4749]: I0310 15:48:44.551956 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:44 crc kubenswrapper[4749]: I0310 15:48:44.835449 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:44 crc kubenswrapper[4749]: I0310 15:48:44.836918 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:44 crc kubenswrapper[4749]: I0310 15:48:44.836966 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:44 crc kubenswrapper[4749]: I0310 15:48:44.836983 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:44 crc kubenswrapper[4749]: I0310 15:48:44.837016 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:48:44 crc kubenswrapper[4749]: E0310 15:48:44.843728 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 15:48:44 crc kubenswrapper[4749]: E0310 15:48:44.850070 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 15:48:45 crc kubenswrapper[4749]: I0310 15:48:45.553256 4749 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.432119 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.432920 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.437239 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.437313 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.437325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.438252 4749 scope.go:117] "RemoveContainer" containerID="e43a5e58920235e9c6d87a086d1d1503eae14e1037c6cd17682b0106801be4d1" Mar 10 15:48:46 crc kubenswrapper[4749]: E0310 15:48:46.438498 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.446331 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.446568 4749 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.447961 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.448010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.448023 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.460524 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.552292 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.760222 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.761477 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.761529 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.761544 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.967992 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 15:48:46 crc kubenswrapper[4749]: I0310 15:48:46.984241 4749 reflector.go:368] Caches populated for 
*v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 15:48:47 crc kubenswrapper[4749]: W0310 15:48:47.167328 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:47 crc kubenswrapper[4749]: E0310 15:48:47.167393 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 15:48:47 crc kubenswrapper[4749]: I0310 15:48:47.178913 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:48:47 crc kubenswrapper[4749]: I0310 15:48:47.179049 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 15:48:47 crc kubenswrapper[4749]: I0310 15:48:47.551587 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:48 crc 
kubenswrapper[4749]: E0310 15:48:48.450669 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896a9176a8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.545353896 +0000 UTC m=+0.667219583,LastTimestamp:2026-03-10 15:48:23.545353896 +0000 UTC m=+0.667219583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.455132 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d935889 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595808905 +0000 UTC m=+0.717674592,LastTimestamp:2026-03-10 15:48:23.595808905 +0000 UTC m=+0.717674592,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.459563 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189b85896d93b26c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595831916 +0000 UTC m=+0.717697603,LastTimestamp:2026-03-10 15:48:23.595831916 +0000 UTC m=+0.717697603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.464269 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d93da62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595842146 +0000 UTC m=+0.717707833,LastTimestamp:2026-03-10 15:48:23.595842146 +0000 UTC m=+0.717707833,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.468691 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b8589712dd410 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.65626472 +0000 UTC m=+0.778130407,LastTimestamp:2026-03-10 15:48:23.65626472 +0000 UTC m=+0.778130407,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.475457 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d935889\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d935889 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595808905 +0000 UTC m=+0.717674592,LastTimestamp:2026-03-10 15:48:23.707902651 +0000 UTC m=+0.829768348,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.480388 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d93b26c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d93b26c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595831916 +0000 UTC m=+0.717697603,LastTimestamp:2026-03-10 15:48:23.707927812 +0000 UTC m=+0.829793499,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.487516 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d93da62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d93da62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595842146 +0000 UTC m=+0.717707833,LastTimestamp:2026-03-10 15:48:23.707938332 +0000 UTC m=+0.829804019,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.493908 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d935889\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d935889 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595808905 +0000 UTC 
m=+0.717674592,LastTimestamp:2026-03-10 15:48:23.709415161 +0000 UTC m=+0.831280848,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.499368 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d93b26c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d93b26c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595831916 +0000 UTC m=+0.717697603,LastTimestamp:2026-03-10 15:48:23.709444431 +0000 UTC m=+0.831310118,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.505282 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d93da62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d93da62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595842146 +0000 UTC m=+0.717707833,LastTimestamp:2026-03-10 15:48:23.709460652 +0000 UTC m=+0.831326329,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.510813 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d935889\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d935889 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595808905 +0000 UTC m=+0.717674592,LastTimestamp:2026-03-10 15:48:23.71050356 +0000 UTC m=+0.832369257,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.515775 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d93b26c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d93b26c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595831916 +0000 UTC m=+0.717697603,LastTimestamp:2026-03-10 15:48:23.710616683 +0000 UTC m=+0.832482380,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.520218 4749 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d93da62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d93da62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595842146 +0000 UTC m=+0.717707833,LastTimestamp:2026-03-10 15:48:23.710651804 +0000 UTC m=+0.832517491,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.526286 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d935889\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d935889 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595808905 +0000 UTC m=+0.717674592,LastTimestamp:2026-03-10 15:48:23.711864366 +0000 UTC m=+0.833730053,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.531458 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d93b26c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d93b26c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595831916 +0000 UTC m=+0.717697603,LastTimestamp:2026-03-10 15:48:23.711881467 +0000 UTC m=+0.833747154,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.536265 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d93da62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d93da62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595842146 +0000 UTC m=+0.717707833,LastTimestamp:2026-03-10 15:48:23.711896167 +0000 UTC m=+0.833761854,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.540440 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d935889\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d935889 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595808905 +0000 UTC m=+0.717674592,LastTimestamp:2026-03-10 15:48:23.712327199 +0000 UTC m=+0.834192886,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.545043 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d93b26c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d93b26c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595831916 +0000 UTC m=+0.717697603,LastTimestamp:2026-03-10 15:48:23.712412742 +0000 UTC m=+0.834278429,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: I0310 15:48:48.549698 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.571521 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d93da62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d93da62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595842146 +0000 UTC m=+0.717707833,LastTimestamp:2026-03-10 15:48:23.712539405 +0000 UTC m=+0.834405092,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.573441 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d935889\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d935889 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595808905 +0000 UTC m=+0.717674592,LastTimestamp:2026-03-10 15:48:23.713651674 +0000 UTC m=+0.835517361,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.580653 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d93b26c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d93b26c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595831916 +0000 UTC m=+0.717697603,LastTimestamp:2026-03-10 15:48:23.713674825 +0000 UTC m=+0.835540512,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.587457 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d93da62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d93da62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595842146 +0000 UTC m=+0.717707833,LastTimestamp:2026-03-10 15:48:23.713687355 +0000 UTC m=+0.835553042,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.594752 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d935889\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d935889 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595808905 +0000 UTC m=+0.717674592,LastTimestamp:2026-03-10 15:48:23.715147394 +0000 UTC m=+0.837013081,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.599743 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b85896d93b26c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b85896d93b26c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:23.595831916 +0000 UTC m=+0.717697603,LastTimestamp:2026-03-10 15:48:23.715164544 +0000 UTC m=+0.837030231,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.604240 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b85898ac436c1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.085550785 +0000 UTC m=+1.207416482,LastTimestamp:2026-03-10 15:48:24.085550785 +0000 UTC m=+1.207416482,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.608176 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b85898b3f7a3f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.093628991 +0000 UTC m=+1.215494718,LastTimestamp:2026-03-10 15:48:24.093628991 +0000 UTC m=+1.215494718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.611671 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b85898b4e73fa openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.094610426 +0000 UTC m=+1.216476153,LastTimestamp:2026-03-10 15:48:24.094610426 +0000 UTC m=+1.216476153,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.615292 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b85898c768839 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.114014265 +0000 UTC m=+1.235879952,LastTimestamp:2026-03-10 15:48:24.114014265 +0000 UTC m=+1.235879952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.619228 4749 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b85898cba750c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.118465804 +0000 UTC m=+1.240331491,LastTimestamp:2026-03-10 15:48:24.118465804 +0000 UTC m=+1.240331491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.623882 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b8589b0abd86d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.721487981 +0000 UTC m=+1.843353658,LastTimestamp:2026-03-10 15:48:24.721487981 +0000 UTC 
m=+1.843353658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.628435 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b8589b0cf22b1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.723800753 +0000 UTC m=+1.845666440,LastTimestamp:2026-03-10 15:48:24.723800753 +0000 UTC m=+1.845666440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.633452 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8589b0d21642 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.723994178 
+0000 UTC m=+1.845859865,LastTimestamp:2026-03-10 15:48:24.723994178 +0000 UTC m=+1.845859865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.638117 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b8589b0d84fe5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.724402149 +0000 UTC m=+1.846267836,LastTimestamp:2026-03-10 15:48:24.724402149 +0000 UTC m=+1.846267836,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.643098 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8589b0e14a15 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.724990485 +0000 UTC 
m=+1.846856172,LastTimestamp:2026-03-10 15:48:24.724990485 +0000 UTC m=+1.846856172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.648411 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b8589b152c1da openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.732426714 +0000 UTC m=+1.854292401,LastTimestamp:2026-03-10 15:48:24.732426714 +0000 UTC m=+1.854292401,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.653302 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8589b199a35f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container 
kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.737071967 +0000 UTC m=+1.858937654,LastTimestamp:2026-03-10 15:48:24.737071967 +0000 UTC m=+1.858937654,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.657477 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8589b1aa6659 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.738170457 +0000 UTC m=+1.860036144,LastTimestamp:2026-03-10 15:48:24.738170457 +0000 UTC m=+1.860036144,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.662433 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b8589b1abdd2a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.73826641 +0000 UTC m=+1.860132097,LastTimestamp:2026-03-10 15:48:24.73826641 +0000 UTC m=+1.860132097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.667128 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8589b1c190ef openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.739688687 +0000 UTC m=+1.861554374,LastTimestamp:2026-03-10 15:48:24.739688687 +0000 UTC m=+1.861554374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.671783 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b8589b1cded1f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.740498719 +0000 UTC m=+1.862364406,LastTimestamp:2026-03-10 15:48:24.740498719 +0000 UTC m=+1.862364406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.675496 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8589c2e491ce openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.027195342 +0000 UTC m=+2.149061029,LastTimestamp:2026-03-10 15:48:25.027195342 +0000 UTC m=+2.149061029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.680490 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8589c39e2a8f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.039358607 +0000 UTC m=+2.161224294,LastTimestamp:2026-03-10 15:48:25.039358607 +0000 UTC m=+2.161224294,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.685365 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8589c3b72af7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.040997111 +0000 UTC m=+2.162862808,LastTimestamp:2026-03-10 15:48:25.040997111 +0000 UTC m=+2.162862808,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.689830 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8589cf9e58ce openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.240697038 +0000 UTC m=+2.362562725,LastTimestamp:2026-03-10 15:48:25.240697038 +0000 UTC m=+2.362562725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.694225 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8589d04dbb3b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.252191035 +0000 UTC m=+2.374056712,LastTimestamp:2026-03-10 15:48:25.252191035 +0000 UTC m=+2.374056712,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.698534 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8589d06bf626 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.254172198 +0000 UTC m=+2.376037885,LastTimestamp:2026-03-10 15:48:25.254172198 +0000 UTC m=+2.376037885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.703092 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8589da6eb8fb openshift-kube-controller-manager 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.422125307 +0000 UTC m=+2.543991014,LastTimestamp:2026-03-10 15:48:25.422125307 +0000 UTC m=+2.543991014,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.707617 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8589db25e856 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.434130518 +0000 UTC m=+2.555996205,LastTimestamp:2026-03-10 15:48:25.434130518 +0000 UTC m=+2.555996205,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.712315 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b8589e6f415ee openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.632191982 +0000 UTC m=+2.754057659,LastTimestamp:2026-03-10 15:48:25.632191982 +0000 UTC m=+2.754057659,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.716931 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8589e71a2f47 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.634688839 +0000 UTC m=+2.756554526,LastTimestamp:2026-03-10 15:48:25.634688839 +0000 UTC m=+2.756554526,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.721366 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b8589e7b20c68 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.644641384 +0000 UTC m=+2.766507071,LastTimestamp:2026-03-10 15:48:25.644641384 +0000 UTC m=+2.766507071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.725559 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b8589e7c2e538 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.645745464 +0000 UTC m=+2.767611151,LastTimestamp:2026-03-10 15:48:25.645745464 +0000 UTC m=+2.767611151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.730076 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8589f59397e2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.877526498 +0000 UTC m=+2.999392185,LastTimestamp:2026-03-10 15:48:25.877526498 +0000 UTC m=+2.999392185,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.734778 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b8589f5da209e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.882149022 +0000 UTC m=+3.004014709,LastTimestamp:2026-03-10 15:48:25.882149022 +0000 UTC m=+3.004014709,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.739683 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b8589f6b4d089 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.896480905 +0000 UTC m=+3.018346592,LastTimestamp:2026-03-10 15:48:25.896480905 +0000 UTC m=+3.018346592,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.744603 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b8589f6f2da7c 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.900546684 +0000 UTC m=+3.022412371,LastTimestamp:2026-03-10 15:48:25.900546684 +0000 UTC m=+3.022412371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.748795 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b8589f6f3aa12 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.900599826 +0000 UTC m=+3.022465533,LastTimestamp:2026-03-10 15:48:25.900599826 +0000 UTC m=+3.022465533,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.754113 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b8589f6f618a0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.9007592 +0000 UTC m=+3.022624887,LastTimestamp:2026-03-10 15:48:25.9007592 +0000 UTC m=+3.022624887,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.759393 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b8589f7074486 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.90188455 +0000 UTC m=+3.023750237,LastTimestamp:2026-03-10 15:48:25.90188455 +0000 UTC m=+3.023750237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.765412 4749 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b8589f7713dce openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.908829646 +0000 UTC m=+3.030695333,LastTimestamp:2026-03-10 15:48:25.908829646 +0000 UTC m=+3.030695333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.770878 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b8589f7a078de openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.911924958 +0000 UTC m=+3.033790645,LastTimestamp:2026-03-10 15:48:25.911924958 +0000 UTC m=+3.033790645,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.776954 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b8589f8653ab0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.924819632 +0000 UTC m=+3.046685319,LastTimestamp:2026-03-10 15:48:25.924819632 +0000 UTC m=+3.046685319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.784443 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858a029699ad openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.095827373 +0000 UTC 
m=+3.217693060,LastTimestamp:2026-03-10 15:48:26.095827373 +0000 UTC m=+3.217693060,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.790362 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b858a029d664f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.096272975 +0000 UTC m=+3.218138662,LastTimestamp:2026-03-10 15:48:26.096272975 +0000 UTC m=+3.218138662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.794276 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b858a03a4e3a3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.113541027 +0000 UTC m=+3.235406714,LastTimestamp:2026-03-10 15:48:26.113541027 +0000 UTC m=+3.235406714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.798395 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b858a03c37f3c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.11554694 +0000 UTC m=+3.237412627,LastTimestamp:2026-03-10 15:48:26.11554694 +0000 UTC m=+3.237412627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.802350 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858a03c6f1b8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.115772856 +0000 UTC m=+3.237638543,LastTimestamp:2026-03-10 15:48:26.115772856 +0000 UTC m=+3.237638543,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.806936 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858a03d4c2e3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.116678371 +0000 UTC m=+3.238544058,LastTimestamp:2026-03-10 15:48:26.116678371 +0000 UTC m=+3.238544058,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.811189 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b858a10c059ac openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.333444524 +0000 UTC m=+3.455310211,LastTimestamp:2026-03-10 15:48:26.333444524 +0000 UTC m=+3.455310211,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.815941 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858a110fae7f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.338643583 +0000 UTC m=+3.460509290,LastTimestamp:2026-03-10 15:48:26.338643583 +0000 UTC m=+3.460509290,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc 
kubenswrapper[4749]: E0310 15:48:48.820003 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b858a1172dcf0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.345143536 +0000 UTC m=+3.467009223,LastTimestamp:2026-03-10 15:48:26.345143536 +0000 UTC m=+3.467009223,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.825328 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858a11ce5cf7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.351140087 +0000 UTC m=+3.473005774,LastTimestamp:2026-03-10 15:48:26.351140087 +0000 UTC 
m=+3.473005774,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.829819 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858a11e6d4ec openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.35274366 +0000 UTC m=+3.474609347,LastTimestamp:2026-03-10 15:48:26.35274366 +0000 UTC m=+3.474609347,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.833738 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858a1bd80a6e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.519546478 +0000 UTC m=+3.641412165,LastTimestamp:2026-03-10 15:48:26.519546478 +0000 UTC m=+3.641412165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.837985 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858a1c9f6af9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.532612857 +0000 UTC m=+3.654478544,LastTimestamp:2026-03-10 15:48:26.532612857 +0000 UTC m=+3.654478544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.842296 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b858a1cb1dfe8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.53382244 +0000 UTC m=+3.655688127,LastTimestamp:2026-03-10 15:48:26.53382244 +0000 UTC m=+3.655688127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.848389 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858a241dfbac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.658347948 +0000 UTC m=+3.780213655,LastTimestamp:2026-03-10 15:48:26.658347948 +0000 UTC m=+3.780213655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.853862 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858a272bb171 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.709578097 +0000 UTC m=+3.831443784,LastTimestamp:2026-03-10 15:48:26.709578097 +0000 UTC m=+3.831443784,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.858161 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858a28260237 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.725982775 +0000 UTC m=+3.847848472,LastTimestamp:2026-03-10 
15:48:26.725982775 +0000 UTC m=+3.847848472,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.862403 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858a2f40e7d2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.845186002 +0000 UTC m=+3.967051689,LastTimestamp:2026-03-10 15:48:26.845186002 +0000 UTC m=+3.967051689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.866407 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858a303c9079 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.861678713 +0000 UTC m=+3.983544400,LastTimestamp:2026-03-10 15:48:26.861678713 +0000 UTC 
m=+3.983544400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.871440 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858a60f150e6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:27.678830822 +0000 UTC m=+4.800696509,LastTimestamp:2026-03-10 15:48:27.678830822 +0000 UTC m=+4.800696509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.875019 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858a6da47e99 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:27.891900057 +0000 UTC 
m=+5.013765764,LastTimestamp:2026-03-10 15:48:27.891900057 +0000 UTC m=+5.013765764,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.879327 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858a7085047c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:27.940168828 +0000 UTC m=+5.062034515,LastTimestamp:2026-03-10 15:48:27.940168828 +0000 UTC m=+5.062034515,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.882766 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858a709d0d4d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:27.941743949 +0000 UTC m=+5.063609626,LastTimestamp:2026-03-10 15:48:27.941743949 +0000 UTC m=+5.063609626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.886046 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858a8ee0f7b6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:28.44951135 +0000 UTC m=+5.571377037,LastTimestamp:2026-03-10 15:48:28.44951135 +0000 UTC m=+5.571377037,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.890587 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858a910bba14 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:28.485868052 +0000 UTC 
m=+5.607733739,LastTimestamp:2026-03-10 15:48:28.485868052 +0000 UTC m=+5.607733739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.894674 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858a91263084 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:28.487602308 +0000 UTC m=+5.609467995,LastTimestamp:2026-03-10 15:48:28.487602308 +0000 UTC m=+5.609467995,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.900300 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858a9d22f5bc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:28.688717244 +0000 UTC m=+5.810582931,LastTimestamp:2026-03-10 15:48:28.688717244 +0000 UTC m=+5.810582931,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.907050 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858a9e270377 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:28.705760119 +0000 UTC m=+5.827625806,LastTimestamp:2026-03-10 15:48:28.705760119 +0000 UTC m=+5.827625806,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.913088 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858a9e3f5ede openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:28.707356382 +0000 UTC m=+5.829222069,LastTimestamp:2026-03-10 15:48:28.707356382 +0000 UTC m=+5.829222069,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.917702 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858aa99e4ae1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:28.898126561 +0000 UTC m=+6.019992248,LastTimestamp:2026-03-10 15:48:28.898126561 +0000 UTC m=+6.019992248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.921986 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858aaa8f82a5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:28.913935013 +0000 UTC m=+6.035800700,LastTimestamp:2026-03-10 15:48:28.913935013 +0000 UTC m=+6.035800700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.926064 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858aaaa69edc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:28.915449564 +0000 UTC m=+6.037315251,LastTimestamp:2026-03-10 15:48:28.915449564 +0000 UTC m=+6.037315251,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.930863 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858ab5e3c2b5 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:29.104005813 +0000 UTC m=+6.225871500,LastTimestamp:2026-03-10 15:48:29.104005813 +0000 UTC m=+6.225871500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.936435 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b858ab7fadda7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:29.139074471 +0000 UTC m=+6.260940168,LastTimestamp:2026-03-10 15:48:29.139074471 +0000 UTC m=+6.260940168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.943816 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 15:48:48 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-controller-manager-crc.189b858c9723d12f 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 15:48:48 crc kubenswrapper[4749]: body: Mar 10 15:48:48 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:37.178044719 +0000 UTC m=+14.299910406,LastTimestamp:2026-03-10 15:48:37.178044719 +0000 UTC m=+14.299910406,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:48:48 crc kubenswrapper[4749]: > Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.948593 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b858c9724d1cb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:37.178110411 +0000 UTC 
m=+14.299976098,LastTimestamp:2026-03-10 15:48:37.178110411 +0000 UTC m=+14.299976098,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.952684 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 15:48:48 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-apiserver-crc.189b858ca1def4f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:42040->192.168.126.11:17697: read: connection reset by peer Mar 10 15:48:48 crc kubenswrapper[4749]: body: Mar 10 15:48:48 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:37.358081271 +0000 UTC m=+14.479946958,LastTimestamp:2026-03-10 15:48:37.358081271 +0000 UTC m=+14.479946958,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:48:48 crc kubenswrapper[4749]: > Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.957178 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858ca1dfc3d9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42040->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:37.358134233 +0000 UTC m=+14.479999920,LastTimestamp:2026-03-10 15:48:37.358134233 +0000 UTC m=+14.479999920,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.961820 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b858a1cb1dfe8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858a1cb1dfe8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.53382244 +0000 UTC m=+3.655688127,LastTimestamp:2026-03-10 15:48:37.723300912 +0000 UTC m=+14.845166599,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: 
E0310 15:48:48.966744 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b858a272bb171\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858a272bb171 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.709578097 +0000 UTC m=+3.831443784,LastTimestamp:2026-03-10 15:48:38.032076194 +0000 UTC m=+15.153941911,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.971795 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b858a28260237\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858a28260237 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:26.725982775 +0000 UTC m=+3.847848472,LastTimestamp:2026-03-10 15:48:38.066142745 +0000 UTC 
m=+15.188008432,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.977091 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 15:48:48 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-apiserver-crc.189b858ce2bcacd1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 15:48:48 crc kubenswrapper[4749]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 15:48:48 crc kubenswrapper[4749]: Mar 10 15:48:48 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:38.446353617 +0000 UTC m=+15.568219304,LastTimestamp:2026-03-10 15:48:38.446353617 +0000 UTC m=+15.568219304,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:48:48 crc kubenswrapper[4749]: > Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.981737 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b858ce2bdd5d1 openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:38.446429649 +0000 UTC m=+15.568295336,LastTimestamp:2026-03-10 15:48:38.446429649 +0000 UTC m=+15.568295336,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.988607 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b858c9723d12f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 15:48:48 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-controller-manager-crc.189b858c9723d12f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 15:48:48 crc kubenswrapper[4749]: body: Mar 10 15:48:48 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:37.178044719 +0000 UTC m=+14.299910406,LastTimestamp:2026-03-10 15:48:47.179002037 +0000 UTC m=+24.300867734,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:48:48 crc kubenswrapper[4749]: > Mar 10 15:48:48 crc kubenswrapper[4749]: E0310 15:48:48.993260 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b858c9724d1cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b858c9724d1cb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:37.178110411 +0000 UTC m=+14.299976098,LastTimestamp:2026-03-10 15:48:47.179089409 +0000 UTC m=+24.300955096,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:49 crc kubenswrapper[4749]: I0310 15:48:49.552493 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:50 crc kubenswrapper[4749]: W0310 15:48:50.370933 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 10 15:48:50 crc 
kubenswrapper[4749]: E0310 15:48:50.370997 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 15:48:50 crc kubenswrapper[4749]: I0310 15:48:50.552106 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:50 crc kubenswrapper[4749]: W0310 15:48:50.923306 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 10 15:48:50 crc kubenswrapper[4749]: E0310 15:48:50.923394 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 15:48:51 crc kubenswrapper[4749]: I0310 15:48:51.551737 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:51 crc kubenswrapper[4749]: I0310 15:48:51.844216 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:51 crc kubenswrapper[4749]: I0310 15:48:51.846218 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:51 crc kubenswrapper[4749]: I0310 15:48:51.846303 4749 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:51 crc kubenswrapper[4749]: I0310 15:48:51.846325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:51 crc kubenswrapper[4749]: I0310 15:48:51.846370 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:48:51 crc kubenswrapper[4749]: E0310 15:48:51.853583 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 15:48:51 crc kubenswrapper[4749]: E0310 15:48:51.853682 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 15:48:52 crc kubenswrapper[4749]: W0310 15:48:52.028055 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 10 15:48:52 crc kubenswrapper[4749]: E0310 15:48:52.028117 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 15:48:52 crc kubenswrapper[4749]: I0310 15:48:52.551469 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 10 15:48:53 crc kubenswrapper[4749]: I0310 15:48:53.552396 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:53 crc kubenswrapper[4749]: E0310 15:48:53.665568 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:48:54 crc kubenswrapper[4749]: I0310 15:48:54.552945 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:55 crc kubenswrapper[4749]: I0310 15:48:55.552898 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:55 crc kubenswrapper[4749]: I0310 15:48:55.795473 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:45810->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 10 15:48:55 crc kubenswrapper[4749]: I0310 15:48:55.795590 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:45810->192.168.126.11:10357: read: connection reset by peer" Mar 10 15:48:55 crc kubenswrapper[4749]: I0310 15:48:55.795667 
4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:48:55 crc kubenswrapper[4749]: I0310 15:48:55.795901 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:55 crc kubenswrapper[4749]: I0310 15:48:55.797501 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:55 crc kubenswrapper[4749]: I0310 15:48:55.797533 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:55 crc kubenswrapper[4749]: I0310 15:48:55.797546 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:55 crc kubenswrapper[4749]: I0310 15:48:55.798084 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 10 15:48:55 crc kubenswrapper[4749]: I0310 15:48:55.798277 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84" gracePeriod=30 Mar 10 15:48:55 crc kubenswrapper[4749]: E0310 15:48:55.802853 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 15:48:55 crc kubenswrapper[4749]: 
&Event{ObjectMeta:{kube-controller-manager-crc.189b8590ecd4a8dd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:45810->192.168.126.11:10357: read: connection reset by peer Mar 10 15:48:55 crc kubenswrapper[4749]: body: Mar 10 15:48:55 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:55.795566813 +0000 UTC m=+32.917432520,LastTimestamp:2026-03-10 15:48:55.795566813 +0000 UTC m=+32.917432520,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:48:55 crc kubenswrapper[4749]: > Mar 10 15:48:55 crc kubenswrapper[4749]: E0310 15:48:55.805707 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8590ecd59411 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:45810->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:55.795627025 +0000 UTC 
m=+32.917492712,LastTimestamp:2026-03-10 15:48:55.795627025 +0000 UTC m=+32.917492712,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:55 crc kubenswrapper[4749]: E0310 15:48:55.807771 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8590ecfdb03b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:55.798255675 +0000 UTC m=+32.920121372,LastTimestamp:2026-03-10 15:48:55.798255675 +0000 UTC m=+32.920121372,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:56 crc kubenswrapper[4749]: E0310 15:48:56.336068 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b8589b1aa6659\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8589b1aa6659 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:24.738170457 +0000 UTC m=+1.860036144,LastTimestamp:2026-03-10 15:48:56.332549976 +0000 UTC m=+33.454415663,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:56 crc kubenswrapper[4749]: E0310 15:48:56.536489 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b8589c2e491ce\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8589c2e491ce openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.027195342 +0000 UTC m=+2.149061029,LastTimestamp:2026-03-10 15:48:56.527616979 +0000 UTC m=+33.649482666,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:56 crc kubenswrapper[4749]: E0310 15:48:56.543801 4749 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189b8589c39e2a8f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8589c39e2a8f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:48:25.039358607 +0000 UTC m=+2.161224294,LastTimestamp:2026-03-10 15:48:56.537800901 +0000 UTC m=+33.659666588,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:48:56 crc kubenswrapper[4749]: I0310 15:48:56.552621 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:56 crc kubenswrapper[4749]: I0310 15:48:56.789595 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 15:48:56 crc kubenswrapper[4749]: I0310 15:48:56.789958 4749 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84" exitCode=255 Mar 10 15:48:56 crc kubenswrapper[4749]: I0310 15:48:56.790002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84"} Mar 10 15:48:56 crc kubenswrapper[4749]: I0310 15:48:56.790042 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde"} Mar 10 15:48:56 crc kubenswrapper[4749]: I0310 15:48:56.790153 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:48:56 crc kubenswrapper[4749]: I0310 15:48:56.791475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:56 crc kubenswrapper[4749]: I0310 15:48:56.791501 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:56 crc kubenswrapper[4749]: I0310 15:48:56.791511 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:57 crc kubenswrapper[4749]: I0310 15:48:57.552840 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:58 crc kubenswrapper[4749]: I0310 15:48:58.553295 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:48:58 crc kubenswrapper[4749]: I0310 15:48:58.854288 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Mar 10 15:48:58 crc kubenswrapper[4749]: I0310 15:48:58.855974 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:48:58 crc kubenswrapper[4749]: I0310 15:48:58.856015 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:48:58 crc kubenswrapper[4749]: I0310 15:48:58.856025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:48:58 crc kubenswrapper[4749]: I0310 15:48:58.856051 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:48:58 crc kubenswrapper[4749]: E0310 15:48:58.859218 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 15:48:58 crc kubenswrapper[4749]: E0310 15:48:58.859488 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 15:48:59 crc kubenswrapper[4749]: I0310 15:48:59.553661 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.004940 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.005166 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.006788 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.006854 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.006935 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.555044 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.606018 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.607403 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.607537 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.607647 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.608402 4749 scope.go:117] "RemoveContainer" containerID="e43a5e58920235e9c6d87a086d1d1503eae14e1037c6cd17682b0106801be4d1" Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.804542 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.807130 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"87debf88d8138c2687875d36502657caeb62c21e5b52cad21232171cd4f2044c"} Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.807404 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.808414 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.808466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:00 crc kubenswrapper[4749]: I0310 15:49:00.808483 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:01 crc kubenswrapper[4749]: I0310 15:49:01.552547 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:01 crc kubenswrapper[4749]: I0310 15:49:01.811646 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 15:49:01 crc kubenswrapper[4749]: I0310 15:49:01.812390 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 15:49:01 crc kubenswrapper[4749]: I0310 15:49:01.814172 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="87debf88d8138c2687875d36502657caeb62c21e5b52cad21232171cd4f2044c" exitCode=255 Mar 10 15:49:01 crc kubenswrapper[4749]: I0310 15:49:01.814225 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"87debf88d8138c2687875d36502657caeb62c21e5b52cad21232171cd4f2044c"} Mar 10 15:49:01 crc kubenswrapper[4749]: I0310 15:49:01.814277 4749 scope.go:117] "RemoveContainer" containerID="e43a5e58920235e9c6d87a086d1d1503eae14e1037c6cd17682b0106801be4d1" Mar 10 15:49:01 crc kubenswrapper[4749]: I0310 15:49:01.814443 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:01 crc kubenswrapper[4749]: I0310 15:49:01.815269 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:01 crc kubenswrapper[4749]: I0310 15:49:01.815301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:01 crc kubenswrapper[4749]: I0310 15:49:01.815314 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:01 crc kubenswrapper[4749]: I0310 15:49:01.815932 4749 scope.go:117] "RemoveContainer" containerID="87debf88d8138c2687875d36502657caeb62c21e5b52cad21232171cd4f2044c" Mar 10 15:49:01 crc kubenswrapper[4749]: E0310 15:49:01.816126 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:49:02 crc kubenswrapper[4749]: I0310 15:49:02.552595 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:02 crc kubenswrapper[4749]: I0310 15:49:02.818757 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 15:49:03 crc kubenswrapper[4749]: I0310 15:49:03.552223 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:03 crc kubenswrapper[4749]: E0310 15:49:03.665757 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:49:04 crc kubenswrapper[4749]: I0310 15:49:04.177467 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:49:04 crc kubenswrapper[4749]: I0310 15:49:04.177682 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:04 crc kubenswrapper[4749]: I0310 15:49:04.179008 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:04 crc kubenswrapper[4749]: I0310 15:49:04.179469 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:04 crc kubenswrapper[4749]: I0310 15:49:04.179522 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:04 crc kubenswrapper[4749]: I0310 15:49:04.185164 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:49:04 crc kubenswrapper[4749]: I0310 15:49:04.552770 4749 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:04 crc kubenswrapper[4749]: I0310 15:49:04.826761 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:04 crc kubenswrapper[4749]: I0310 15:49:04.828166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:04 crc kubenswrapper[4749]: I0310 15:49:04.828222 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:04 crc kubenswrapper[4749]: I0310 15:49:04.828237 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:05 crc kubenswrapper[4749]: I0310 15:49:05.553413 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:05 crc kubenswrapper[4749]: I0310 15:49:05.859958 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:05 crc kubenswrapper[4749]: I0310 15:49:05.861533 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:05 crc kubenswrapper[4749]: I0310 15:49:05.861596 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:05 crc kubenswrapper[4749]: I0310 15:49:05.861612 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:05 crc kubenswrapper[4749]: I0310 15:49:05.861652 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" 
Mar 10 15:49:05 crc kubenswrapper[4749]: E0310 15:49:05.865321 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 15:49:05 crc kubenswrapper[4749]: E0310 15:49:05.866289 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 15:49:06 crc kubenswrapper[4749]: I0310 15:49:06.432555 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:49:06 crc kubenswrapper[4749]: I0310 15:49:06.432723 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:06 crc kubenswrapper[4749]: I0310 15:49:06.433873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:06 crc kubenswrapper[4749]: I0310 15:49:06.433909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:06 crc kubenswrapper[4749]: I0310 15:49:06.433920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:06 crc kubenswrapper[4749]: I0310 15:49:06.434555 4749 scope.go:117] "RemoveContainer" containerID="87debf88d8138c2687875d36502657caeb62c21e5b52cad21232171cd4f2044c" Mar 10 15:49:06 crc kubenswrapper[4749]: E0310 15:49:06.434721 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:49:06 crc kubenswrapper[4749]: I0310 15:49:06.550923 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:07 crc kubenswrapper[4749]: W0310 15:49:07.336737 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 10 15:49:07 crc kubenswrapper[4749]: E0310 15:49:07.336816 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 15:49:07 crc kubenswrapper[4749]: I0310 15:49:07.551726 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:08 crc kubenswrapper[4749]: I0310 15:49:08.552001 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:09 crc kubenswrapper[4749]: I0310 15:49:09.552596 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: 
User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:09 crc kubenswrapper[4749]: W0310 15:49:09.565813 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:09 crc kubenswrapper[4749]: E0310 15:49:09.565882 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 15:49:09 crc kubenswrapper[4749]: W0310 15:49:09.578705 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 10 15:49:09 crc kubenswrapper[4749]: E0310 15:49:09.578813 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 15:49:10 crc kubenswrapper[4749]: I0310 15:49:10.008895 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:49:10 crc kubenswrapper[4749]: I0310 15:49:10.009108 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:10 crc kubenswrapper[4749]: I0310 15:49:10.010800 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:10 crc 
kubenswrapper[4749]: I0310 15:49:10.010871 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:10 crc kubenswrapper[4749]: I0310 15:49:10.010889 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:10 crc kubenswrapper[4749]: I0310 15:49:10.554988 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:11 crc kubenswrapper[4749]: I0310 15:49:11.063185 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:49:11 crc kubenswrapper[4749]: I0310 15:49:11.063474 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:11 crc kubenswrapper[4749]: I0310 15:49:11.065199 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:11 crc kubenswrapper[4749]: I0310 15:49:11.065280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:11 crc kubenswrapper[4749]: I0310 15:49:11.065296 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:11 crc kubenswrapper[4749]: I0310 15:49:11.066160 4749 scope.go:117] "RemoveContainer" containerID="87debf88d8138c2687875d36502657caeb62c21e5b52cad21232171cd4f2044c" Mar 10 15:49:11 crc kubenswrapper[4749]: E0310 15:49:11.066421 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:49:11 crc kubenswrapper[4749]: I0310 15:49:11.552345 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:12 crc kubenswrapper[4749]: I0310 15:49:12.555433 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:12 crc kubenswrapper[4749]: I0310 15:49:12.866449 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:12 crc kubenswrapper[4749]: I0310 15:49:12.868282 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:12 crc kubenswrapper[4749]: I0310 15:49:12.868330 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:12 crc kubenswrapper[4749]: I0310 15:49:12.868341 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:12 crc kubenswrapper[4749]: I0310 15:49:12.868393 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:49:12 crc kubenswrapper[4749]: E0310 15:49:12.871657 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 15:49:12 crc kubenswrapper[4749]: E0310 15:49:12.871661 4749 
kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 15:49:13 crc kubenswrapper[4749]: W0310 15:49:13.030514 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 10 15:49:13 crc kubenswrapper[4749]: E0310 15:49:13.030596 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 15:49:13 crc kubenswrapper[4749]: I0310 15:49:13.553874 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:13 crc kubenswrapper[4749]: E0310 15:49:13.665912 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:49:13 crc kubenswrapper[4749]: I0310 15:49:13.669743 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:49:13 crc kubenswrapper[4749]: I0310 15:49:13.669878 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:13 crc kubenswrapper[4749]: I0310 15:49:13.670923 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:13 crc kubenswrapper[4749]: I0310 15:49:13.672638 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:13 crc kubenswrapper[4749]: I0310 15:49:13.672783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:14 crc kubenswrapper[4749]: I0310 15:49:14.552719 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:15 crc kubenswrapper[4749]: I0310 15:49:15.552866 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:16 crc kubenswrapper[4749]: I0310 15:49:16.553774 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:17 crc kubenswrapper[4749]: I0310 15:49:17.553793 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:18 crc kubenswrapper[4749]: I0310 15:49:18.553062 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:19 crc kubenswrapper[4749]: I0310 15:49:19.553119 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 10 15:49:19 crc kubenswrapper[4749]: I0310 15:49:19.871989 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:19 crc kubenswrapper[4749]: I0310 15:49:19.873863 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:19 crc kubenswrapper[4749]: I0310 15:49:19.873999 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:19 crc kubenswrapper[4749]: I0310 15:49:19.874069 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:19 crc kubenswrapper[4749]: I0310 15:49:19.874164 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:49:19 crc kubenswrapper[4749]: E0310 15:49:19.877922 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 15:49:19 crc kubenswrapper[4749]: E0310 15:49:19.877951 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 15:49:20 crc kubenswrapper[4749]: I0310 15:49:20.552163 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:21 crc kubenswrapper[4749]: I0310 15:49:21.551708 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:21 crc kubenswrapper[4749]: I0310 15:49:21.606509 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:21 crc kubenswrapper[4749]: I0310 15:49:21.607959 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:21 crc kubenswrapper[4749]: I0310 15:49:21.608010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:21 crc kubenswrapper[4749]: I0310 15:49:21.608023 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:21 crc kubenswrapper[4749]: I0310 15:49:21.608715 4749 scope.go:117] "RemoveContainer" containerID="87debf88d8138c2687875d36502657caeb62c21e5b52cad21232171cd4f2044c" Mar 10 15:49:21 crc kubenswrapper[4749]: E0310 15:49:21.608897 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:49:22 crc kubenswrapper[4749]: I0310 15:49:22.552169 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:23 crc kubenswrapper[4749]: I0310 15:49:23.552190 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 10 15:49:23 crc kubenswrapper[4749]: E0310 15:49:23.666889 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:49:24 crc kubenswrapper[4749]: I0310 15:49:24.552225 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:25 crc kubenswrapper[4749]: I0310 15:49:25.552661 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:26 crc kubenswrapper[4749]: I0310 15:49:26.552118 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:26 crc kubenswrapper[4749]: I0310 15:49:26.878306 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:26 crc kubenswrapper[4749]: I0310 15:49:26.880271 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:26 crc kubenswrapper[4749]: I0310 15:49:26.880345 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:26 crc kubenswrapper[4749]: I0310 15:49:26.880361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:26 crc kubenswrapper[4749]: I0310 15:49:26.880499 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:49:26 crc kubenswrapper[4749]: E0310 15:49:26.885911 4749 
kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 15:49:26 crc kubenswrapper[4749]: E0310 15:49:26.885951 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 15:49:27 crc kubenswrapper[4749]: I0310 15:49:27.551264 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:28 crc kubenswrapper[4749]: I0310 15:49:28.551483 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:49:28 crc kubenswrapper[4749]: I0310 15:49:28.606434 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:28 crc kubenswrapper[4749]: I0310 15:49:28.607425 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:28 crc kubenswrapper[4749]: I0310 15:49:28.607461 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:28 crc kubenswrapper[4749]: I0310 15:49:28.607469 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:29 crc kubenswrapper[4749]: I0310 15:49:29.071257 4749 csr.go:261] certificate signing request csr-l7w7p is approved, waiting to be issued Mar 10 15:49:29 crc 
kubenswrapper[4749]: I0310 15:49:29.086524 4749 csr.go:257] certificate signing request csr-l7w7p is issued Mar 10 15:49:29 crc kubenswrapper[4749]: I0310 15:49:29.132210 4749 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 10 15:49:29 crc kubenswrapper[4749]: I0310 15:49:29.416038 4749 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 10 15:49:30 crc kubenswrapper[4749]: I0310 15:49:30.087925 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-15 21:45:14.367752326 +0000 UTC Mar 10 15:49:30 crc kubenswrapper[4749]: I0310 15:49:30.088602 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6725h55m44.279178971s for next certificate rotation Mar 10 15:49:33 crc kubenswrapper[4749]: E0310 15:49:33.667961 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.886621 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.888155 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.888205 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.888216 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.888336 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.896789 4749 kubelet_node_status.go:115] "Node 
was previously registered" node="crc" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.897300 4749 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 10 15:49:33 crc kubenswrapper[4749]: E0310 15:49:33.897337 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.902052 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.902098 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.902109 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.902127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.902139 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:33Z","lastTransitionTime":"2026-03-10T15:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:33 crc kubenswrapper[4749]: E0310 15:49:33.921217 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.928815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.928857 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.928868 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.928888 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.928901 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:33Z","lastTransitionTime":"2026-03-10T15:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:33 crc kubenswrapper[4749]: E0310 15:49:33.942821 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.954339 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.954415 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.954428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.954448 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.954460 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:33Z","lastTransitionTime":"2026-03-10T15:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:33 crc kubenswrapper[4749]: E0310 15:49:33.967939 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.974971 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.975013 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.975026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.975044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:33 crc kubenswrapper[4749]: I0310 15:49:33.975058 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:33Z","lastTransitionTime":"2026-03-10T15:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:33 crc kubenswrapper[4749]: E0310 15:49:33.988480 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:33 crc kubenswrapper[4749]: E0310 15:49:33.988650 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:49:33 crc kubenswrapper[4749]: E0310 15:49:33.988683 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:34 crc kubenswrapper[4749]: E0310 15:49:34.089012 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:34 crc kubenswrapper[4749]: E0310 15:49:34.189953 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:34 crc kubenswrapper[4749]: E0310 15:49:34.290097 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:34 crc kubenswrapper[4749]: E0310 15:49:34.391060 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:34 crc kubenswrapper[4749]: E0310 15:49:34.492070 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:34 crc kubenswrapper[4749]: E0310 15:49:34.592259 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:34 crc kubenswrapper[4749]: E0310 15:49:34.692438 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:34 crc kubenswrapper[4749]: E0310 15:49:34.792620 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:34 crc kubenswrapper[4749]: E0310 15:49:34.893319 4749 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:34 crc kubenswrapper[4749]: E0310 15:49:34.994405 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:35 crc kubenswrapper[4749]: E0310 15:49:35.095474 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:35 crc kubenswrapper[4749]: E0310 15:49:35.196271 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:35 crc kubenswrapper[4749]: E0310 15:49:35.296509 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:35 crc kubenswrapper[4749]: E0310 15:49:35.397481 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:35 crc kubenswrapper[4749]: E0310 15:49:35.497861 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:35 crc kubenswrapper[4749]: E0310 15:49:35.598007 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:35 crc kubenswrapper[4749]: I0310 15:49:35.606431 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:35 crc kubenswrapper[4749]: I0310 15:49:35.607751 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:35 crc kubenswrapper[4749]: I0310 15:49:35.607785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:35 crc kubenswrapper[4749]: I0310 15:49:35.607796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:35 crc 
kubenswrapper[4749]: I0310 15:49:35.608510 4749 scope.go:117] "RemoveContainer" containerID="87debf88d8138c2687875d36502657caeb62c21e5b52cad21232171cd4f2044c" Mar 10 15:49:35 crc kubenswrapper[4749]: E0310 15:49:35.698957 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:35 crc kubenswrapper[4749]: E0310 15:49:35.799776 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:35 crc kubenswrapper[4749]: E0310 15:49:35.900331 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:35 crc kubenswrapper[4749]: I0310 15:49:35.905146 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 15:49:35 crc kubenswrapper[4749]: I0310 15:49:35.908835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b"} Mar 10 15:49:35 crc kubenswrapper[4749]: I0310 15:49:35.909015 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:35 crc kubenswrapper[4749]: I0310 15:49:35.910271 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:35 crc kubenswrapper[4749]: I0310 15:49:35.910478 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:35 crc kubenswrapper[4749]: I0310 15:49:35.910579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:36 crc kubenswrapper[4749]: E0310 15:49:36.001265 4749 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:36 crc kubenswrapper[4749]: E0310 15:49:36.101656 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:36 crc kubenswrapper[4749]: E0310 15:49:36.201984 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:36 crc kubenswrapper[4749]: E0310 15:49:36.302160 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:36 crc kubenswrapper[4749]: E0310 15:49:36.403224 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:36 crc kubenswrapper[4749]: I0310 15:49:36.432662 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:49:36 crc kubenswrapper[4749]: E0310 15:49:36.503519 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:36 crc kubenswrapper[4749]: E0310 15:49:36.604638 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:36 crc kubenswrapper[4749]: E0310 15:49:36.705225 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:36 crc kubenswrapper[4749]: E0310 15:49:36.806440 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:36 crc kubenswrapper[4749]: E0310 15:49:36.907771 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:36 crc kubenswrapper[4749]: I0310 15:49:36.914881 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 15:49:36 crc kubenswrapper[4749]: I0310 15:49:36.915716 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 15:49:36 crc kubenswrapper[4749]: I0310 15:49:36.918542 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b" exitCode=255 Mar 10 15:49:36 crc kubenswrapper[4749]: I0310 15:49:36.918602 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b"} Mar 10 15:49:36 crc kubenswrapper[4749]: I0310 15:49:36.918661 4749 scope.go:117] "RemoveContainer" containerID="87debf88d8138c2687875d36502657caeb62c21e5b52cad21232171cd4f2044c" Mar 10 15:49:36 crc kubenswrapper[4749]: I0310 15:49:36.918759 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:36 crc kubenswrapper[4749]: I0310 15:49:36.919908 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:36 crc kubenswrapper[4749]: I0310 15:49:36.919979 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:36 crc kubenswrapper[4749]: I0310 15:49:36.919999 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:36 crc kubenswrapper[4749]: I0310 15:49:36.920804 4749 scope.go:117] "RemoveContainer" containerID="17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b" Mar 10 15:49:36 
crc kubenswrapper[4749]: E0310 15:49:36.921015 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:49:37 crc kubenswrapper[4749]: E0310 15:49:37.008563 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:37 crc kubenswrapper[4749]: E0310 15:49:37.109178 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:37 crc kubenswrapper[4749]: E0310 15:49:37.209914 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:37 crc kubenswrapper[4749]: E0310 15:49:37.310963 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:37 crc kubenswrapper[4749]: E0310 15:49:37.411750 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:37 crc kubenswrapper[4749]: E0310 15:49:37.512108 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:37 crc kubenswrapper[4749]: E0310 15:49:37.612619 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:37 crc kubenswrapper[4749]: E0310 15:49:37.712970 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:37 crc kubenswrapper[4749]: E0310 15:49:37.814302 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:37 crc 
kubenswrapper[4749]: E0310 15:49:37.914449 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:37 crc kubenswrapper[4749]: I0310 15:49:37.923478 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 15:49:37 crc kubenswrapper[4749]: I0310 15:49:37.925426 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:37 crc kubenswrapper[4749]: I0310 15:49:37.926475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:37 crc kubenswrapper[4749]: I0310 15:49:37.926562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:37 crc kubenswrapper[4749]: I0310 15:49:37.926590 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:37 crc kubenswrapper[4749]: I0310 15:49:37.927510 4749 scope.go:117] "RemoveContainer" containerID="17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b" Mar 10 15:49:37 crc kubenswrapper[4749]: E0310 15:49:37.927821 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:49:38 crc kubenswrapper[4749]: E0310 15:49:38.015332 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:38 crc kubenswrapper[4749]: E0310 15:49:38.116496 4749 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Mar 10 15:49:38 crc kubenswrapper[4749]: E0310 15:49:38.217747 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:38 crc kubenswrapper[4749]: E0310 15:49:38.318940 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:38 crc kubenswrapper[4749]: E0310 15:49:38.419354 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:38 crc kubenswrapper[4749]: E0310 15:49:38.520439 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:38 crc kubenswrapper[4749]: E0310 15:49:38.620777 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:38 crc kubenswrapper[4749]: E0310 15:49:38.721685 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:38 crc kubenswrapper[4749]: E0310 15:49:38.822068 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:38 crc kubenswrapper[4749]: E0310 15:49:38.923190 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:39 crc kubenswrapper[4749]: E0310 15:49:39.023345 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:39 crc kubenswrapper[4749]: E0310 15:49:39.123817 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:39 crc kubenswrapper[4749]: E0310 15:49:39.224776 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:39 crc kubenswrapper[4749]: E0310 15:49:39.325907 4749 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:39 crc kubenswrapper[4749]: E0310 15:49:39.427130 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:39 crc kubenswrapper[4749]: E0310 15:49:39.527944 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:39 crc kubenswrapper[4749]: E0310 15:49:39.628984 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:39 crc kubenswrapper[4749]: E0310 15:49:39.729587 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:39 crc kubenswrapper[4749]: E0310 15:49:39.830684 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:39 crc kubenswrapper[4749]: E0310 15:49:39.931147 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:40 crc kubenswrapper[4749]: E0310 15:49:40.032197 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:40 crc kubenswrapper[4749]: E0310 15:49:40.132528 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:40 crc kubenswrapper[4749]: E0310 15:49:40.233667 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:40 crc kubenswrapper[4749]: E0310 15:49:40.334519 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:40 crc kubenswrapper[4749]: I0310 15:49:40.393295 4749 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 15:49:40 crc 
kubenswrapper[4749]: E0310 15:49:40.435558 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:40 crc kubenswrapper[4749]: E0310 15:49:40.536471 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:40 crc kubenswrapper[4749]: E0310 15:49:40.636875 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:40 crc kubenswrapper[4749]: E0310 15:49:40.737579 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:40 crc kubenswrapper[4749]: E0310 15:49:40.838452 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:40 crc kubenswrapper[4749]: E0310 15:49:40.939407 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:41 crc kubenswrapper[4749]: E0310 15:49:41.039822 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:41 crc kubenswrapper[4749]: I0310 15:49:41.062341 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:49:41 crc kubenswrapper[4749]: I0310 15:49:41.062627 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:49:41 crc kubenswrapper[4749]: I0310 15:49:41.064162 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:41 crc kubenswrapper[4749]: I0310 15:49:41.064198 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:41 crc kubenswrapper[4749]: I0310 15:49:41.064211 4749 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:41 crc kubenswrapper[4749]: I0310 15:49:41.064949 4749 scope.go:117] "RemoveContainer" containerID="17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b" Mar 10 15:49:41 crc kubenswrapper[4749]: E0310 15:49:41.065133 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:49:41 crc kubenswrapper[4749]: E0310 15:49:41.140475 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:41 crc kubenswrapper[4749]: E0310 15:49:41.240800 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:41 crc kubenswrapper[4749]: E0310 15:49:41.341183 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:41 crc kubenswrapper[4749]: E0310 15:49:41.442147 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:41 crc kubenswrapper[4749]: E0310 15:49:41.543009 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:41 crc kubenswrapper[4749]: E0310 15:49:41.643966 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:41 crc kubenswrapper[4749]: E0310 15:49:41.744317 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:41 crc kubenswrapper[4749]: E0310 15:49:41.844599 4749 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 10 15:49:41 crc kubenswrapper[4749]: E0310 15:49:41.945571 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:42 crc kubenswrapper[4749]: E0310 15:49:42.046302 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:42 crc kubenswrapper[4749]: E0310 15:49:42.147017 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:42 crc kubenswrapper[4749]: I0310 15:49:42.187198 4749 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 15:49:42 crc kubenswrapper[4749]: E0310 15:49:42.248173 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:42 crc kubenswrapper[4749]: E0310 15:49:42.348491 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:42 crc kubenswrapper[4749]: E0310 15:49:42.449229 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:42 crc kubenswrapper[4749]: E0310 15:49:42.549500 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:42 crc kubenswrapper[4749]: E0310 15:49:42.649696 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:42 crc kubenswrapper[4749]: E0310 15:49:42.749863 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:42 crc kubenswrapper[4749]: E0310 15:49:42.850775 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:42 crc kubenswrapper[4749]: E0310 15:49:42.951475 4749 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:43 crc kubenswrapper[4749]: E0310 15:49:43.051786 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:43 crc kubenswrapper[4749]: E0310 15:49:43.152508 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:43 crc kubenswrapper[4749]: E0310 15:49:43.253557 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:43 crc kubenswrapper[4749]: E0310 15:49:43.354405 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:43 crc kubenswrapper[4749]: E0310 15:49:43.455168 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:43 crc kubenswrapper[4749]: E0310 15:49:43.556140 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:43 crc kubenswrapper[4749]: E0310 15:49:43.657108 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:43 crc kubenswrapper[4749]: E0310 15:49:43.668339 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:49:43 crc kubenswrapper[4749]: E0310 15:49:43.757453 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:43 crc kubenswrapper[4749]: E0310 15:49:43.858611 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:43 crc kubenswrapper[4749]: E0310 15:49:43.959272 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.059955 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.160880 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.199655 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.205112 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.205158 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.205171 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.205191 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.205202 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:44Z","lastTransitionTime":"2026-03-10T15:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.215299 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.218682 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.218725 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.218738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.218759 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.218773 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:44Z","lastTransitionTime":"2026-03-10T15:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.230969 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.235017 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.235081 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.235094 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.235115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.235128 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:44Z","lastTransitionTime":"2026-03-10T15:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.247280 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.252578 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.252627 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.252637 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.252657 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:44 crc kubenswrapper[4749]: I0310 15:49:44.252668 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:44Z","lastTransitionTime":"2026-03-10T15:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.263688 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.263818 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.263853 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.364472 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.464756 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.565213 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.666326 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.766727 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.867893 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:44 crc kubenswrapper[4749]: E0310 15:49:44.968845 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:45 crc kubenswrapper[4749]: E0310 15:49:45.070020 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:45 crc kubenswrapper[4749]: E0310 15:49:45.170968 4749 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:45 crc kubenswrapper[4749]: E0310 15:49:45.271825 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:45 crc kubenswrapper[4749]: E0310 15:49:45.372231 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:45 crc kubenswrapper[4749]: E0310 15:49:45.473320 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:45 crc kubenswrapper[4749]: E0310 15:49:45.573602 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:45 crc kubenswrapper[4749]: E0310 15:49:45.674456 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:45 crc kubenswrapper[4749]: E0310 15:49:45.775064 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:45 crc kubenswrapper[4749]: E0310 15:49:45.875214 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:45 crc kubenswrapper[4749]: E0310 15:49:45.975560 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:46 crc kubenswrapper[4749]: E0310 15:49:46.076714 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:46 crc kubenswrapper[4749]: E0310 15:49:46.177647 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:46 crc kubenswrapper[4749]: E0310 15:49:46.278451 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:46 crc 
kubenswrapper[4749]: E0310 15:49:46.379043 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:46 crc kubenswrapper[4749]: E0310 15:49:46.480174 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:46 crc kubenswrapper[4749]: E0310 15:49:46.581289 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:46 crc kubenswrapper[4749]: E0310 15:49:46.681911 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:46 crc kubenswrapper[4749]: E0310 15:49:46.782417 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:46 crc kubenswrapper[4749]: E0310 15:49:46.882757 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:46 crc kubenswrapper[4749]: E0310 15:49:46.983102 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:47 crc kubenswrapper[4749]: E0310 15:49:47.084076 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:47 crc kubenswrapper[4749]: E0310 15:49:47.184993 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:47 crc kubenswrapper[4749]: I0310 15:49:47.271317 4749 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 15:49:47 crc kubenswrapper[4749]: E0310 15:49:47.285650 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:47 crc kubenswrapper[4749]: E0310 15:49:47.386784 4749 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 10 15:49:47 crc kubenswrapper[4749]: E0310 15:49:47.488005 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:47 crc kubenswrapper[4749]: E0310 15:49:47.588709 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:47 crc kubenswrapper[4749]: E0310 15:49:47.689632 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:47 crc kubenswrapper[4749]: E0310 15:49:47.789782 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:47 crc kubenswrapper[4749]: E0310 15:49:47.890952 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:47 crc kubenswrapper[4749]: E0310 15:49:47.992147 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:48 crc kubenswrapper[4749]: E0310 15:49:48.092801 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:48 crc kubenswrapper[4749]: E0310 15:49:48.193856 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:48 crc kubenswrapper[4749]: E0310 15:49:48.295013 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:48 crc kubenswrapper[4749]: E0310 15:49:48.395123 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:48 crc kubenswrapper[4749]: E0310 15:49:48.495493 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:48 crc kubenswrapper[4749]: E0310 15:49:48.595611 4749 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:48 crc kubenswrapper[4749]: E0310 15:49:48.696535 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:48 crc kubenswrapper[4749]: E0310 15:49:48.797307 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:48 crc kubenswrapper[4749]: E0310 15:49:48.898515 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:48 crc kubenswrapper[4749]: E0310 15:49:48.998915 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:49 crc kubenswrapper[4749]: E0310 15:49:49.099478 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:49 crc kubenswrapper[4749]: E0310 15:49:49.200565 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:49 crc kubenswrapper[4749]: E0310 15:49:49.301626 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:49 crc kubenswrapper[4749]: E0310 15:49:49.402440 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:49 crc kubenswrapper[4749]: E0310 15:49:49.503499 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:49 crc kubenswrapper[4749]: E0310 15:49:49.604640 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:49 crc kubenswrapper[4749]: E0310 15:49:49.705064 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:49 crc kubenswrapper[4749]: E0310 
15:49:49.806341 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:49 crc kubenswrapper[4749]: E0310 15:49:49.907443 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.007879 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.108323 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.209576 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.310563 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.345750 4749 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.413082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.413149 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.413161 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.413190 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.413203 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:50Z","lastTransitionTime":"2026-03-10T15:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.516174 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.516218 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.516229 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.516248 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.516259 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:50Z","lastTransitionTime":"2026-03-10T15:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.575449 4749 apiserver.go:52] "Watching apiserver" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.580936 4749 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.581565 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-nvpsq","openshift-multus/multus-additional-cni-plugins-tp7tp","openshift-multus/network-metrics-daemon-jpmqp","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7","openshift-dns/node-resolver-r8l57","openshift-network-diagnostics/network-check-target-xd92c","openshift-image-registry/node-ca-j4tr6","openshift-machine-config-operator/machine-config-daemon-p7rts","openshift-multus/multus-gwpmf","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.582098 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.582263 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.582369 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.582442 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.582275 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.583292 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.584620 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.584790 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.584892 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.584965 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.585052 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.585523 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.585579 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.585774 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.585846 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.586946 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-r8l57" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.587917 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.588700 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j4tr6" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.589252 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.588860 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.593691 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.593947 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.594126 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.594450 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.598900 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.601399 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.601549 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.601586 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.601785 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.601844 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.601876 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.601920 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.601976 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.602125 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.602201 4749 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.602274 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.602317 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.604499 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.604545 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.605060 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.605254 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.605283 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.605462 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.605569 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.605621 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.605772 4749 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.605790 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.605873 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.605900 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.605925 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.605879 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.605978 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.606125 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.606144 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.618473 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.623246 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.623454 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.623489 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.623499 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.623521 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.623533 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:50Z","lastTransitionTime":"2026-03-10T15:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.635078 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.649189 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.650540 4749 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.661459 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.672779 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.682438 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.691955 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692096 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692131 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692165 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692196 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692231 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692260 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692293 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692322 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692351 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692396 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692407 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692426 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692452 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692479 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692504 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692531 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692560 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692587 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692614 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692642 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692670 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692699 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692726 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692754 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692780 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692810 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692828 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692808 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692946 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692975 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693000 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.692946 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693032 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693061 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693088 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693113 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693235 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693139 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693318 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693355 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693408 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693443 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693467 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693493 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693511 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693530 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693547 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693568 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693586 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693605 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693623 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693640 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693659 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693677 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693695 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693713 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693729 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693744 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693761 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693781 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693799 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693816 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693833 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693867 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693886 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693908 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693931 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693961 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.693991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 10 
15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694016 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694045 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694067 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694095 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694118 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694139 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694164 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694191 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694216 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694243 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694268 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694292 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694316 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694317 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694346 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694458 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694474 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694475 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694553 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694684 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694715 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694873 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694898 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694916 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694906 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694972 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695044 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694989 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.694974 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695067 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695124 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695163 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695086 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695192 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695219 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695260 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695285 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695298 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695339 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695430 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695711 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695851 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.696911 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.697084 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.697407 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.697562 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.697586 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.697735 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.697816 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.698275 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.697829 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.697873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.698708 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.699003 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.699015 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.699245 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.699252 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.699265 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.699443 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.699566 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.699739 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.695317 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.699833 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.699864 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.699969 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700043 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700071 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700099 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700129 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700157 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700185 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700215 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700246 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700276 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700306 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700333 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700361 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700484 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700520 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700550 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700582 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700591 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700636 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700646 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700677 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700705 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700740 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700766 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700798 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700825 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700855 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700881 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700912 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700920 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700939 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700968 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700996 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701022 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701046 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701075 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701099 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701122 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701147 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701172 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701199 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701225 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701254 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701277 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701300 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701323 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701347 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701398 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701426 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701450 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for
volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701475 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701500 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701547 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701575 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701602 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 
15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701628 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701655 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701684 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701706 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701730 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701755 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701775 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701793 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701815 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701833 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701853 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 15:49:50 crc kubenswrapper[4749]: 
I0310 15:49:50.701875 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702012 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702032 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702049 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702068 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702087 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702108 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702128 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702145 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702163 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 15:49:50 
crc kubenswrapper[4749]: I0310 15:49:50.702181 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702197 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702214 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702233 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702249 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702268 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702285 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702307 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702325 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702343 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702358 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702415 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702439 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702458 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702476 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702495 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702512 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702529 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702549 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702569 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702589 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702610 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 
15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702629 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702647 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702728 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702778 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-var-lib-cni-multus\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc 
kubenswrapper[4749]: I0310 15:49:50.702801 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0636061-098d-4b79-b24d-ae0e070c8b17-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702824 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f483ba5-0e39-43a8-b651-9db5308235d8-serviceca\") pod \"node-ca-j4tr6\" (UID: \"5f483ba5-0e39-43a8-b651-9db5308235d8\") " pod="openshift-image-registry/node-ca-j4tr6" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702842 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/07b78914-24ae-4dc3-a640-23ade3cb9d39-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pp7d7\" (UID: \"07b78914-24ae-4dc3-a640-23ade3cb9d39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702857 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-ovn\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702877 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-cni-bin\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702900 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sj9k\" (UniqueName: \"kubernetes.io/projected/807d12f5-c95a-4a7e-91c5-128de3d2235c-kube-api-access-4sj9k\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702916 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-systemd-units\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-hostroot\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702957 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0636061-098d-4b79-b24d-ae0e070c8b17-cni-binary-copy\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702975 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwnzp\" (UniqueName: \"kubernetes.io/projected/5f483ba5-0e39-43a8-b651-9db5308235d8-kube-api-access-nwnzp\") pod \"node-ca-j4tr6\" (UID: 
\"5f483ba5-0e39-43a8-b651-9db5308235d8\") " pod="openshift-image-registry/node-ca-j4tr6" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702991 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-log-socket\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.703008 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovnkube-script-lib\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.703028 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.703045 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs\") pod \"network-metrics-daemon-jpmqp\" (UID: \"cd3985af-f2c3-4f91-919e-2ea9420418b3\") " pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.703064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x88x2\" (UniqueName: 
\"kubernetes.io/projected/07b78914-24ae-4dc3-a640-23ade3cb9d39-kube-api-access-x88x2\") pod \"ovnkube-control-plane-749d76644c-pp7d7\" (UID: \"07b78914-24ae-4dc3-a640-23ade3cb9d39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.703085 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eadca31d-151b-4569-8c6f-71ce4a6f0d8e-hosts-file\") pod \"node-resolver-r8l57\" (UID: \"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\") " pod="openshift-dns/node-resolver-r8l57" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.703102 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-run-k8s-cni-cncf-io\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.703126 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.703146 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.703165 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq7ml\" (UniqueName: \"kubernetes.io/projected/eadca31d-151b-4569-8c6f-71ce4a6f0d8e-kube-api-access-bq7ml\") pod \"node-resolver-r8l57\" (UID: \"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\") " pod="openshift-dns/node-resolver-r8l57" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.703183 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-os-release\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.704886 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.704986 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/07b78914-24ae-4dc3-a640-23ade3cb9d39-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pp7d7\" (UID: \"07b78914-24ae-4dc3-a640-23ade3cb9d39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705038 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ebcbc0fc-15f3-4e4e-ae14-832adec8da50-mcd-auth-proxy-config\") pod \"machine-config-daemon-p7rts\" (UID: \"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\") " pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 15:49:50 
crc kubenswrapper[4749]: I0310 15:49:50.705084 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-run-multus-certs\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705122 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-etc-kubernetes\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0636061-098d-4b79-b24d-ae0e070c8b17-cnibin\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705195 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/07b78914-24ae-4dc3-a640-23ade3cb9d39-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pp7d7\" (UID: \"07b78914-24ae-4dc3-a640-23ade3cb9d39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705253 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") 
" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705291 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0636061-098d-4b79-b24d-ae0e070c8b17-system-cni-dir\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705324 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-env-overrides\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705359 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705407 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-openvswitch\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705443 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-run-ovn-kubernetes\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705472 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovnkube-config\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-etc-openvswitch\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705531 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-multus-cni-dir\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705571 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705657 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zktgg\" 
(UniqueName: \"kubernetes.io/projected/e0636061-098d-4b79-b24d-ae0e070c8b17-kube-api-access-zktgg\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705754 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nz5z\" (UniqueName: \"kubernetes.io/projected/cd3985af-f2c3-4f91-919e-2ea9420418b3-kube-api-access-7nz5z\") pod \"network-metrics-daemon-jpmqp\" (UID: \"cd3985af-f2c3-4f91-919e-2ea9420418b3\") " pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705867 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovn-node-metrics-cert\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705894 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-system-cni-dir\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705927 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-systemd\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705995 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706031 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebcbc0fc-15f3-4e4e-ae14-832adec8da50-proxy-tls\") pod \"machine-config-daemon-p7rts\" (UID: \"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\") " pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706065 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-var-lib-cni-bin\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " 
pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8gk7\" (UniqueName: \"kubernetes.io/projected/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-kube-api-access-s8gk7\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706135 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-var-lib-kubelet\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706165 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/807d12f5-c95a-4a7e-91c5-128de3d2235c-multus-daemon-config\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706199 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0636061-098d-4b79-b24d-ae0e070c8b17-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706289 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706323 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-var-lib-openvswitch\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706354 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ebcbc0fc-15f3-4e4e-ae14-832adec8da50-rootfs\") pod \"machine-config-daemon-p7rts\" (UID: \"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\") " pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/807d12f5-c95a-4a7e-91c5-128de3d2235c-cni-binary-copy\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706438 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-multus-socket-dir-parent\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706471 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-run-netns\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706498 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-run-netns\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706530 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-node-log\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706566 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-cnibin\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706598 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-multus-conf-dir\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706632 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-kubelet\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706655 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-slash\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706691 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706726 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk2c5\" (UniqueName: \"kubernetes.io/projected/ebcbc0fc-15f3-4e4e-ae14-832adec8da50-kube-api-access-qk2c5\") pod \"machine-config-daemon-p7rts\" (UID: \"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\") " pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706758 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0636061-098d-4b79-b24d-ae0e070c8b17-os-release\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706785 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f483ba5-0e39-43a8-b651-9db5308235d8-host\") pod \"node-ca-j4tr6\" (UID: \"5f483ba5-0e39-43a8-b651-9db5308235d8\") " pod="openshift-image-registry/node-ca-j4tr6" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706821 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706854 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-cni-netd\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707026 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707046 4749 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707063 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 
15:49:50.707086 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707104 4749 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707119 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707145 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707169 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707183 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707199 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707216 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707237 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707254 4749 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707271 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707294 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707312 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707327 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707344 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707367 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707397 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707413 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707427 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707449 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707463 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707480 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" 
DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707495 4749 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707514 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707532 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707547 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707567 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707582 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707598 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707612 4749 
reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707633 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707648 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707664 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.710598 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.710648 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.710671 4749 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.710879 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.710895 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.710915 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.709800 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.700920 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701073 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701094 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701237 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701422 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701423 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701434 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701612 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.701773 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702043 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702083 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.711254 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702128 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702411 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.702914 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.703030 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.703023 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.703076 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.704340 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.704664 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.704891 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.704850 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.704973 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705157 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705219 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705243 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705269 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705454 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.705598 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706012 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706105 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706754 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.706974 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707028 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707480 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707703 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707907 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.707922 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.708105 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.708089 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.708139 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.708154 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.708220 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.708346 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.708407 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.708532 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.708640 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.708811 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.708832 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.709095 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.709315 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.710529 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.710550 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.710679 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.710983 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.711639 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.711666 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.712239 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.712234 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.712657 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.712699 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.712868 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.712738 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.712927 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.712958 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.713179 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.713296 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.713333 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.713342 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:51.213309241 +0000 UTC m=+88.335174928 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.713361 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.713675 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.713913 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.714121 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.714133 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.714466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.714671 4749 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.714846 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.714948 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.715168 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.715422 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.715705 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.715742 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.716019 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.718330 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.716249 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.716439 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.716502 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.717061 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.717252 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.717464 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.717915 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.718089 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.718256 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.718463 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.718750 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.718816 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.718916 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.718997 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.719305 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.719461 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.719517 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.719597 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.719729 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.719754 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.719830 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.719896 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.720173 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.720240 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.720217 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.720313 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.720254 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.720637 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.720819 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.721157 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.721436 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:49:51.221404283 +0000 UTC m=+88.343269980 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.722183 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.722267 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:51.222236525 +0000 UTC m=+88.344102212 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.727606 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.728756 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.728790 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.728813 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.729194 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:51.229161925 +0000 UTC m=+88.351027612 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.730117 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.730336 4749 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.730526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.730624 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.730906 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.731048 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:50Z","lastTransitionTime":"2026-03-10T15:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.734661 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.735458 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.735588 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.736776 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.736866 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.737522 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.737852 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.738939 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.739733 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.739770 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.739789 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.739852 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:51.239832497 +0000 UTC m=+88.361698184 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.741919 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.742172 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.742365 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.743501 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.743922 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.744292 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.744617 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.744793 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.744875 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.744963 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.745290 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.746717 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.746922 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.747006 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.748215 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.748305 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.748479 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.749675 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.749736 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.749897 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.749923 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.750611 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.750844 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.750842 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.750892 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.751026 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.751417 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.751457 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.751748 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.751916 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.752083 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.752159 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.752106 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.752465 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.753369 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.753891 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.759680 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.763118 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.764404 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.767910 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.771366 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.774819 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.781875 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.791630 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.803786 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.811519 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8gk7\" (UniqueName: \"kubernetes.io/projected/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-kube-api-access-s8gk7\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812115 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebcbc0fc-15f3-4e4e-ae14-832adec8da50-proxy-tls\") pod \"machine-config-daemon-p7rts\" (UID: \"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\") " pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812154 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-var-lib-cni-bin\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812183 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ebcbc0fc-15f3-4e4e-ae14-832adec8da50-rootfs\") pod \"machine-config-daemon-p7rts\" (UID: \"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\") " pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812213 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-var-lib-kubelet\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812236 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/807d12f5-c95a-4a7e-91c5-128de3d2235c-multus-daemon-config\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812259 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0636061-098d-4b79-b24d-ae0e070c8b17-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812285 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-var-lib-cni-bin\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812311 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-var-lib-openvswitch\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812348 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-var-lib-openvswitch\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812354 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-run-netns\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812402 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-run-netns\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/807d12f5-c95a-4a7e-91c5-128de3d2235c-cni-binary-copy\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812445 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-var-lib-kubelet\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-multus-socket-dir-parent\") pod \"multus-gwpmf\" (UID: 
\"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812476 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-slash\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812486 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ebcbc0fc-15f3-4e4e-ae14-832adec8da50-rootfs\") pod \"machine-config-daemon-p7rts\" (UID: \"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\") " pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-run-netns\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812525 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-node-log\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812549 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-cnibin\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 
15:49:50.812574 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-multus-conf-dir\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812599 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-kubelet\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812622 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-cni-netd\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812665 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk2c5\" (UniqueName: \"kubernetes.io/projected/ebcbc0fc-15f3-4e4e-ae14-832adec8da50-kube-api-access-qk2c5\") pod \"machine-config-daemon-p7rts\" (UID: \"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\") " pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812693 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0636061-098d-4b79-b24d-ae0e070c8b17-os-release\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812719 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f483ba5-0e39-43a8-b651-9db5308235d8-host\") pod \"node-ca-j4tr6\" (UID: \"5f483ba5-0e39-43a8-b651-9db5308235d8\") " pod="openshift-image-registry/node-ca-j4tr6" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812743 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f483ba5-0e39-43a8-b651-9db5308235d8-serviceca\") pod \"node-ca-j4tr6\" (UID: \"5f483ba5-0e39-43a8-b651-9db5308235d8\") " pod="openshift-image-registry/node-ca-j4tr6" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812791 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-var-lib-cni-multus\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0636061-098d-4b79-b24d-ae0e070c8b17-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812848 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sj9k\" (UniqueName: \"kubernetes.io/projected/807d12f5-c95a-4a7e-91c5-128de3d2235c-kube-api-access-4sj9k\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812864 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-multus-conf-dir\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812878 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/07b78914-24ae-4dc3-a640-23ade3cb9d39-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pp7d7\" (UID: \"07b78914-24ae-4dc3-a640-23ade3cb9d39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812910 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-run-netns\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.812970 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-ovn\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813020 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-cni-bin\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813028 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-multus-socket-dir-parent\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-node-log\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813057 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwnzp\" (UniqueName: \"kubernetes.io/projected/5f483ba5-0e39-43a8-b651-9db5308235d8-kube-api-access-nwnzp\") pod \"node-ca-j4tr6\" (UID: \"5f483ba5-0e39-43a8-b651-9db5308235d8\") " pod="openshift-image-registry/node-ca-j4tr6" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813073 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-slash\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-cni-netd\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-cnibin\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " 
pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-kubelet\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813146 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-ovn\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813143 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-systemd-units\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813184 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-hostroot\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-systemd-units\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813211 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0636061-098d-4b79-b24d-ae0e070c8b17-cni-binary-copy\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813238 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eadca31d-151b-4569-8c6f-71ce4a6f0d8e-hosts-file\") pod \"node-resolver-r8l57\" (UID: \"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\") " pod="openshift-dns/node-resolver-r8l57" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813259 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-log-socket\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813282 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovnkube-script-lib\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813286 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0636061-098d-4b79-b24d-ae0e070c8b17-os-release\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813346 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs\") pod \"network-metrics-daemon-jpmqp\" (UID: \"cd3985af-f2c3-4f91-919e-2ea9420418b3\") " pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813360 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-var-lib-cni-multus\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813415 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x88x2\" (UniqueName: \"kubernetes.io/projected/07b78914-24ae-4dc3-a640-23ade3cb9d39-kube-api-access-x88x2\") pod \"ovnkube-control-plane-749d76644c-pp7d7\" (UID: \"07b78914-24ae-4dc3-a640-23ade3cb9d39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813436 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/807d12f5-c95a-4a7e-91c5-128de3d2235c-cni-binary-copy\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813451 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-os-release\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813505 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-run-k8s-cni-cncf-io\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813516 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-os-release\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813541 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq7ml\" (UniqueName: \"kubernetes.io/projected/eadca31d-151b-4569-8c6f-71ce4a6f0d8e-kube-api-access-bq7ml\") pod \"node-resolver-r8l57\" (UID: \"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\") " pod="openshift-dns/node-resolver-r8l57" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813571 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-etc-kubernetes\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813566 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f483ba5-0e39-43a8-b651-9db5308235d8-host\") pod \"node-ca-j4tr6\" (UID: \"5f483ba5-0e39-43a8-b651-9db5308235d8\") " pod="openshift-image-registry/node-ca-j4tr6" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813632 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/07b78914-24ae-4dc3-a640-23ade3cb9d39-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pp7d7\" (UID: \"07b78914-24ae-4dc3-a640-23ade3cb9d39\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813753 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ebcbc0fc-15f3-4e4e-ae14-832adec8da50-mcd-auth-proxy-config\") pod \"machine-config-daemon-p7rts\" (UID: \"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\") " pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-run-multus-certs\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813832 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-run-k8s-cni-cncf-io\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0636061-098d-4b79-b24d-ae0e070c8b17-system-cni-dir\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813911 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0636061-098d-4b79-b24d-ae0e070c8b17-cnibin\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " 
pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.813964 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/07b78914-24ae-4dc3-a640-23ade3cb9d39-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pp7d7\" (UID: \"07b78914-24ae-4dc3-a640-23ade3cb9d39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814007 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovnkube-config\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814055 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-env-overrides\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814083 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-openvswitch\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814133 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-run-ovn-kubernetes\") pod \"ovnkube-node-nvpsq\" (UID: 
\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814150 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovnkube-script-lib\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814206 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-etc-openvswitch\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814227 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-multus-cni-dir\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zktgg\" (UniqueName: \"kubernetes.io/projected/e0636061-098d-4b79-b24d-ae0e070c8b17-kube-api-access-zktgg\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: 
\"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nz5z\" (UniqueName: \"kubernetes.io/projected/cd3985af-f2c3-4f91-919e-2ea9420418b3-kube-api-access-7nz5z\") pod \"network-metrics-daemon-jpmqp\" (UID: \"cd3985af-f2c3-4f91-919e-2ea9420418b3\") " pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814286 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814304 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovn-node-metrics-cert\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814320 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-system-cni-dir\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814339 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814356 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-systemd\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814399 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f483ba5-0e39-43a8-b651-9db5308235d8-serviceca\") pod \"node-ca-j4tr6\" (UID: \"5f483ba5-0e39-43a8-b651-9db5308235d8\") " pod="openshift-image-registry/node-ca-j4tr6" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814454 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/807d12f5-c95a-4a7e-91c5-128de3d2235c-multus-daemon-config\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.813602 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-log-socket\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814581 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-multus-cni-dir\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: E0310 15:49:50.814597 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs podName:cd3985af-f2c3-4f91-919e-2ea9420418b3 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:51.314578685 +0000 UTC m=+88.436444582 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs") pod "network-metrics-daemon-jpmqp" (UID: "cd3985af-f2c3-4f91-919e-2ea9420418b3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814761 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-etc-kubernetes\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814844 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/07b78914-24ae-4dc3-a640-23ade3cb9d39-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pp7d7\" (UID: \"07b78914-24ae-4dc3-a640-23ade3cb9d39\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.814904 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.815201 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-host-run-multus-certs\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.815209 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.815276 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-system-cni-dir\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.815433 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebcbc0fc-15f3-4e4e-ae14-832adec8da50-proxy-tls\") pod \"machine-config-daemon-p7rts\" (UID: \"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\") " pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 
15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.815439 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-etc-openvswitch\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.815478 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-cni-bin\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.815650 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/807d12f5-c95a-4a7e-91c5-128de3d2235c-hostroot\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.815742 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-env-overrides\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.816060 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0636061-098d-4b79-b24d-ae0e070c8b17-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.816113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-openvswitch\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.816137 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-run-ovn-kubernetes\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.816200 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0636061-098d-4b79-b24d-ae0e070c8b17-cnibin\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.816226 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-systemd\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.816273 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eadca31d-151b-4569-8c6f-71ce4a6f0d8e-hosts-file\") pod \"node-resolver-r8l57\" (UID: \"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\") " pod="openshift-dns/node-resolver-r8l57" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.816306 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e0636061-098d-4b79-b24d-ae0e070c8b17-system-cni-dir\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.816429 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0636061-098d-4b79-b24d-ae0e070c8b17-cni-binary-copy\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.816591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovnkube-config\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.816588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0636061-098d-4b79-b24d-ae0e070c8b17-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.816706 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.816734 4749 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 
crc kubenswrapper[4749]: I0310 15:49:50.816750 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817114 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/07b78914-24ae-4dc3-a640-23ade3cb9d39-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pp7d7\" (UID: \"07b78914-24ae-4dc3-a640-23ade3cb9d39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817203 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/07b78914-24ae-4dc3-a640-23ade3cb9d39-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pp7d7\" (UID: \"07b78914-24ae-4dc3-a640-23ade3cb9d39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817283 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817317 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817329 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817342 4749 reconciler_common.go:293] "Volume 
detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817356 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817369 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817407 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817424 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817437 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817450 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817466 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817480 4749 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817493 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817507 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817521 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817536 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817550 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817563 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc 
kubenswrapper[4749]: I0310 15:49:50.817579 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817593 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817606 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817618 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817629 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817639 4749 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817650 4749 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817660 4749 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817670 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817680 4749 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817692 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817703 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817716 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817726 4749 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817738 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817748 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817762 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817775 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817788 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817802 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817815 4749 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817827 4749 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817843 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817856 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817868 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817867 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ebcbc0fc-15f3-4e4e-ae14-832adec8da50-mcd-auth-proxy-config\") pod \"machine-config-daemon-p7rts\" (UID: \"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\") " pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817884 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.817982 4749 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818002 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818017 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818030 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818045 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818059 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818072 4749 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818085 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818099 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818112 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818126 4749 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818141 4749 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818160 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818176 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818186 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818198 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 
15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818209 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818219 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818253 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818264 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818275 4749 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818288 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818297 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818311 4749 
reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818320 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818330 4749 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818341 4749 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818350 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818360 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818393 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818405 4749 reconciler_common.go:293] "Volume detached for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818420 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818434 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818444 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818453 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818463 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818473 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818483 4749 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818493 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818503 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818518 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818526 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818535 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818546 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818560 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818629 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.818639 4749 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819010 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819026 4749 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819042 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819054 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819066 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node 
\"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819080 4749 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819093 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819104 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819117 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819129 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819139 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819149 4749 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 
15:49:50.819159 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819170 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819181 4749 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819191 4749 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819207 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819221 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819234 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819248 4749 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819260 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819274 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819285 4749 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819298 4749 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819311 4749 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819324 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819338 4749 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819351 4749 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819367 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819406 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819419 4749 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819434 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819449 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819463 4749 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819475 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819513 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819527 4749 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819540 4749 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819556 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819572 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819586 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819602 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819616 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819628 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819641 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819654 4749 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819666 4749 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819680 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 
15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819695 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819708 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819720 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819732 4749 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819744 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819756 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819767 4749 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819779 4749 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819793 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819794 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovn-node-metrics-cert\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.819807 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.820273 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.831030 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.833489 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x88x2\" (UniqueName: \"kubernetes.io/projected/07b78914-24ae-4dc3-a640-23ade3cb9d39-kube-api-access-x88x2\") pod \"ovnkube-control-plane-749d76644c-pp7d7\" (UID: \"07b78914-24ae-4dc3-a640-23ade3cb9d39\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.833659 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zktgg\" (UniqueName: 
\"kubernetes.io/projected/e0636061-098d-4b79-b24d-ae0e070c8b17-kube-api-access-zktgg\") pod \"multus-additional-cni-plugins-tp7tp\" (UID: \"e0636061-098d-4b79-b24d-ae0e070c8b17\") " pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.833684 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwnzp\" (UniqueName: \"kubernetes.io/projected/5f483ba5-0e39-43a8-b651-9db5308235d8-kube-api-access-nwnzp\") pod \"node-ca-j4tr6\" (UID: \"5f483ba5-0e39-43a8-b651-9db5308235d8\") " pod="openshift-image-registry/node-ca-j4tr6" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.835661 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.835685 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.835697 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.835718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.835731 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:50Z","lastTransitionTime":"2026-03-10T15:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.836194 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8gk7\" (UniqueName: \"kubernetes.io/projected/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-kube-api-access-s8gk7\") pod \"ovnkube-node-nvpsq\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.836615 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk2c5\" (UniqueName: \"kubernetes.io/projected/ebcbc0fc-15f3-4e4e-ae14-832adec8da50-kube-api-access-qk2c5\") pod \"machine-config-daemon-p7rts\" (UID: \"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\") " pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.837095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq7ml\" (UniqueName: \"kubernetes.io/projected/eadca31d-151b-4569-8c6f-71ce4a6f0d8e-kube-api-access-bq7ml\") pod \"node-resolver-r8l57\" (UID: \"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\") " pod="openshift-dns/node-resolver-r8l57" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.837168 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nz5z\" (UniqueName: \"kubernetes.io/projected/cd3985af-f2c3-4f91-919e-2ea9420418b3-kube-api-access-7nz5z\") pod \"network-metrics-daemon-jpmqp\" (UID: \"cd3985af-f2c3-4f91-919e-2ea9420418b3\") " pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.838282 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sj9k\" (UniqueName: \"kubernetes.io/projected/807d12f5-c95a-4a7e-91c5-128de3d2235c-kube-api-access-4sj9k\") pod \"multus-gwpmf\" (UID: \"807d12f5-c95a-4a7e-91c5-128de3d2235c\") " pod="openshift-multus/multus-gwpmf" 
Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.842516 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.853132 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.904910 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.920084 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.932054 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.939132 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.939196 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.939211 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.939234 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.939531 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:50Z","lastTransitionTime":"2026-03-10T15:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.944975 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-r8l57" Mar 10 15:49:50 crc kubenswrapper[4749]: W0310 15:49:50.948117 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-665fd7ae72695b73d004a92b44b22794dc5eac553f3a790c9735cf7db4daca66 WatchSource:0}: Error finding container 665fd7ae72695b73d004a92b44b22794dc5eac553f3a790c9735cf7db4daca66: Status 404 returned error can't find the container with id 665fd7ae72695b73d004a92b44b22794dc5eac553f3a790c9735cf7db4daca66 Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.953754 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" Mar 10 15:49:50 crc kubenswrapper[4749]: W0310 15:49:50.959450 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeadca31d_151b_4569_8c6f_71ce4a6f0d8e.slice/crio-837ddafce7c07370be18a1b514264bf0963084c1976e84a9a6b0c96e97f1046d WatchSource:0}: Error finding container 837ddafce7c07370be18a1b514264bf0963084c1976e84a9a6b0c96e97f1046d: Status 404 returned error can't find the container with id 837ddafce7c07370be18a1b514264bf0963084c1976e84a9a6b0c96e97f1046d Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.960600 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.962609 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0cc63f939c58bd21aef7db47e18f967afaf09e1d35aef3b2e14d1818e140cd57"} Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.967259 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"665fd7ae72695b73d004a92b44b22794dc5eac553f3a790c9735cf7db4daca66"} Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.969094 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7aae3a5ed428ab6b3eadac7bc31bbb6ed629f0f2ed0ab75ccac2b7ed440ac54e"} Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.971299 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j4tr6" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.979525 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 15:49:50 crc kubenswrapper[4749]: W0310 15:49:50.981179 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07b78914_24ae_4dc3_a640_23ade3cb9d39.slice/crio-af65fedb7b0280c5cda7540045b372a7bab2083cd71e57aae9d995204cc66bff WatchSource:0}: Error finding container af65fedb7b0280c5cda7540045b372a7bab2083cd71e57aae9d995204cc66bff: Status 404 returned error can't find the container with id af65fedb7b0280c5cda7540045b372a7bab2083cd71e57aae9d995204cc66bff Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.986545 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:50 crc kubenswrapper[4749]: I0310 15:49:50.998274 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gwpmf" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.044921 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.044977 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.044991 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.045013 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.045028 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:51Z","lastTransitionTime":"2026-03-10T15:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:51 crc kubenswrapper[4749]: W0310 15:49:51.049474 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebcbc0fc_15f3_4e4e_ae14_832adec8da50.slice/crio-2ffe79327a4aeaf2eb02604b5e3966e2fb48a9c0489a45e6bcbad220cc08c05f WatchSource:0}: Error finding container 2ffe79327a4aeaf2eb02604b5e3966e2fb48a9c0489a45e6bcbad220cc08c05f: Status 404 returned error can't find the container with id 2ffe79327a4aeaf2eb02604b5e3966e2fb48a9c0489a45e6bcbad220cc08c05f Mar 10 15:49:51 crc kubenswrapper[4749]: W0310 15:49:51.053214 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod807d12f5_c95a_4a7e_91c5_128de3d2235c.slice/crio-30113cd5d3837fc530c90af2c513bfa614573d8c500991ec6dc4351e4cf04023 WatchSource:0}: Error finding container 30113cd5d3837fc530c90af2c513bfa614573d8c500991ec6dc4351e4cf04023: Status 404 returned error can't find the container with id 30113cd5d3837fc530c90af2c513bfa614573d8c500991ec6dc4351e4cf04023 Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.149169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.149226 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.149239 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.149263 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 
15:49:51.149576 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:51Z","lastTransitionTime":"2026-03-10T15:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.222743 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.222923 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.222979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.223073 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:49:52.223044225 +0000 UTC m=+89.344909912 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.223133 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.223176 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.223252 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:52.22321351 +0000 UTC m=+89.345079377 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.223293 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:52.223263801 +0000 UTC m=+89.345129678 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.252221 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.252264 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.252275 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.252295 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.252305 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:51Z","lastTransitionTime":"2026-03-10T15:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.324010 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs\") pod \"network-metrics-daemon-jpmqp\" (UID: \"cd3985af-f2c3-4f91-919e-2ea9420418b3\") " pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.324075 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.324099 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.324268 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.324301 4749 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.324316 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.324296 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.324268 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.324417 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.324430 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.324395 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:52.32435433 +0000 UTC m=+89.446220017 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.324561 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs podName:cd3985af-f2c3-4f91-919e-2ea9420418b3 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:52.324498504 +0000 UTC m=+89.446364191 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs") pod "network-metrics-daemon-jpmqp" (UID: "cd3985af-f2c3-4f91-919e-2ea9420418b3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:49:51 crc kubenswrapper[4749]: E0310 15:49:51.324604 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:52.324591947 +0000 UTC m=+89.446457794 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.354961 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.355027 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.355073 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.355096 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.355112 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:51Z","lastTransitionTime":"2026-03-10T15:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.458463 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.458515 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.458527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.458549 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.458564 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:51Z","lastTransitionTime":"2026-03-10T15:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.561562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.561621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.561634 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.561656 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.561669 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:51Z","lastTransitionTime":"2026-03-10T15:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.612846 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.613599 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.615341 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.616087 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.617414 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.618078 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.618720 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.619820 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.620473 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.621490 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.622072 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.623569 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.624188 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.624800 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.625834 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.626675 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.627824 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.628593 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.629194 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.630333 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.631124 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.632454 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.632988 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.634268 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.634881 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.635914 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.637171 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.637731 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.639104 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.639758 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.640783 4749 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.640901 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.642666 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.643808 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.644473 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.646660 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.648159 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.649227 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.649957 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.651207 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.652087 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.653212 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.654002 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.655367 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.656044 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.657060 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.657629 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.658941 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.659553 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.660585 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.661158 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.661768 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.662799 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.663341 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.664662 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.664717 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:51 crc 
kubenswrapper[4749]: I0310 15:49:51.664742 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.664767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.664780 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:51Z","lastTransitionTime":"2026-03-10T15:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.767799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.767858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.767873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.767895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.767909 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:51Z","lastTransitionTime":"2026-03-10T15:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.870992 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.871051 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.871069 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.871091 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.871106 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:51Z","lastTransitionTime":"2026-03-10T15:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.972739 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.972774 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.972782 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.972797 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.972807 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:51Z","lastTransitionTime":"2026-03-10T15:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.977790 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.977846 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.979118 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" event={"ID":"e0636061-098d-4b79-b24d-ae0e070c8b17","Type":"ContainerStarted","Data":"48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.979161 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" event={"ID":"e0636061-098d-4b79-b24d-ae0e070c8b17","Type":"ContainerStarted","Data":"2d7590f200d2c59cfee98cb7a3d7d2070108960770dd4cfb555fd7a0194f7736"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.980741 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r8l57" event={"ID":"eadca31d-151b-4569-8c6f-71ce4a6f0d8e","Type":"ContainerStarted","Data":"56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.980783 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r8l57" event={"ID":"eadca31d-151b-4569-8c6f-71ce4a6f0d8e","Type":"ContainerStarted","Data":"837ddafce7c07370be18a1b514264bf0963084c1976e84a9a6b0c96e97f1046d"} Mar 10 
15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.982502 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.982539 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.982549 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"2ffe79327a4aeaf2eb02604b5e3966e2fb48a9c0489a45e6bcbad220cc08c05f"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.983694 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j4tr6" event={"ID":"5f483ba5-0e39-43a8-b651-9db5308235d8","Type":"ContainerStarted","Data":"dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.983715 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j4tr6" event={"ID":"5f483ba5-0e39-43a8-b651-9db5308235d8","Type":"ContainerStarted","Data":"a6197bcf0bbf7a6b9d1c0cd894c3a03d32000dc990931539d6e7575cb67ff94a"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.985792 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834"} Mar 10 15:49:51 crc 
kubenswrapper[4749]: I0310 15:49:51.987497 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwpmf" event={"ID":"807d12f5-c95a-4a7e-91c5-128de3d2235c","Type":"ContainerStarted","Data":"78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.987535 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwpmf" event={"ID":"807d12f5-c95a-4a7e-91c5-128de3d2235c","Type":"ContainerStarted","Data":"30113cd5d3837fc530c90af2c513bfa614573d8c500991ec6dc4351e4cf04023"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.989353 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" event={"ID":"07b78914-24ae-4dc3-a640-23ade3cb9d39","Type":"ContainerStarted","Data":"6b2a49548463e6eee4580cc3b40482068a40ee203c4e5e585d7d92141bd1c8d4"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.989406 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" event={"ID":"07b78914-24ae-4dc3-a640-23ade3cb9d39","Type":"ContainerStarted","Data":"0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.989422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" event={"ID":"07b78914-24ae-4dc3-a640-23ade3cb9d39","Type":"ContainerStarted","Data":"af65fedb7b0280c5cda7540045b372a7bab2083cd71e57aae9d995204cc66bff"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.991225 4749 generic.go:334] "Generic (PLEG): container finished" podID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerID="7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6" exitCode=0 Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.991256 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerDied","Data":"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.991274 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerStarted","Data":"399d8a5766b643b87d73c3fec0ecfa3587f68422cd4590de1f592ecd380e4f04"} Mar 10 15:49:51 crc kubenswrapper[4749]: I0310 15:49:51.993401 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.013491 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.031594 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.043801 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.056228 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.072543 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.082075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.082128 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.082141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.082162 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.082175 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:52Z","lastTransitionTime":"2026-03-10T15:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.083270 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.095436 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.119027 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.131817 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.146245 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.161632 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.174436 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.185207 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.185253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.185264 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.185283 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.185295 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:52Z","lastTransitionTime":"2026-03-10T15:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.186967 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.197837 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.218321 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.230516 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.237290 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.237482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.237531 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.237690 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:49:54.237644539 +0000 UTC m=+91.359510226 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.237711 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.237774 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.237826 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:54.237814094 +0000 UTC m=+91.359680001 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.237866 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:49:54.237846965 +0000 UTC m=+91.359712892 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.242350 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318b
deaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40ee203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.257298 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.264704 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5
b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.272933 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.281993 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.287999 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.288050 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.288065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.288084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.288097 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:52Z","lastTransitionTime":"2026-03-10T15:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.290509 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.302008 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.318950 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.332544 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.338966 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.339014 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.339052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs\") pod \"network-metrics-daemon-jpmqp\" (UID: \"cd3985af-f2c3-4f91-919e-2ea9420418b3\") " pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.339169 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.339235 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs podName:cd3985af-f2c3-4f91-919e-2ea9420418b3 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:54.339215962 +0000 UTC m=+91.461081649 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs") pod "network-metrics-daemon-jpmqp" (UID: "cd3985af-f2c3-4f91-919e-2ea9420418b3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.339170 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.339265 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.339339 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:52 crc 
kubenswrapper[4749]: E0310 15:49:52.339339 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.339744 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.339765 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.339389 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:54.339368236 +0000 UTC m=+91.461233923 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.339885 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:49:54.33985591 +0000 UTC m=+91.461721647 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.346088 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.357162 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.390603 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 
15:49:52.390926 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.390934 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.390950 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.390960 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:52Z","lastTransitionTime":"2026-03-10T15:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.493794 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.493832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.493846 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.493864 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.493877 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:52Z","lastTransitionTime":"2026-03-10T15:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.595792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.595849 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.595862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.595884 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.595897 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:52Z","lastTransitionTime":"2026-03-10T15:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.606354 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.606432 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.606488 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.606496 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.606363 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.606602 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.606712 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:49:52 crc kubenswrapper[4749]: E0310 15:49:52.606797 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.698654 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.698688 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.698699 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.698720 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.698733 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:52Z","lastTransitionTime":"2026-03-10T15:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.801290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.801333 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.801342 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.801361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.801388 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:52Z","lastTransitionTime":"2026-03-10T15:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.903957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.903996 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.904005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.904021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:52 crc kubenswrapper[4749]: I0310 15:49:52.904031 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:52Z","lastTransitionTime":"2026-03-10T15:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.004030 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerStarted","Data":"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.004396 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerStarted","Data":"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.004496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerStarted","Data":"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.004575 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerStarted","Data":"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.006218 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.006329 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.006442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.006525 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.006614 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:53Z","lastTransitionTime":"2026-03-10T15:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.006282 4749 generic.go:334] "Generic (PLEG): container finished" podID="e0636061-098d-4b79-b24d-ae0e070c8b17" containerID="48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503" exitCode=0 Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.006324 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" event={"ID":"e0636061-098d-4b79-b24d-ae0e070c8b17","Type":"ContainerDied","Data":"48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.024123 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.038004 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.057320 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.073743 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.087139 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.106547 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.111324 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.111558 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.111661 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.111691 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.111708 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:53Z","lastTransitionTime":"2026-03-10T15:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.122175 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.138365 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.155608 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.173127 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.185595 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc 
kubenswrapper[4749]: I0310 15:49:53.200610 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc 
kubenswrapper[4749]: I0310 15:49:53.215961 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.217055 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.217095 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.217109 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.217129 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.217141 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:53Z","lastTransitionTime":"2026-03-10T15:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.240278 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.319720 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.319766 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.319779 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.319802 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.319816 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:53Z","lastTransitionTime":"2026-03-10T15:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.422042 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.422102 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.422114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.422135 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.422149 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:53Z","lastTransitionTime":"2026-03-10T15:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.525352 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.525420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.525435 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.525454 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.525464 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:53Z","lastTransitionTime":"2026-03-10T15:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.625064 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.628297 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.628348 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.628360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.628397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.628416 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:53Z","lastTransitionTime":"2026-03-10T15:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.631360 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.632057 4749 scope.go:117] "RemoveContainer" containerID="17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b" Mar 10 15:49:53 crc kubenswrapper[4749]: E0310 15:49:53.632701 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.639673 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.654321 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc 
kubenswrapper[4749]: I0310 15:49:53.668118 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc 
kubenswrapper[4749]: I0310 15:49:53.682475 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.708807 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.730855 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.731004 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.731595 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.731610 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:53 crc 
kubenswrapper[4749]: I0310 15:49:53.731631 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.731644 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:53Z","lastTransitionTime":"2026-03-10T15:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.749078 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40ee203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10
T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.768601 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.785391 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.802181 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.818439 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.834484 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.835086 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.835105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.835129 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.835141 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:53Z","lastTransitionTime":"2026-03-10T15:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.837845 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.855703 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.937763 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 
15:49:53.937817 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.937828 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.937850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:53 crc kubenswrapper[4749]: I0310 15:49:53.937864 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:53Z","lastTransitionTime":"2026-03-10T15:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.012342 4749 generic.go:334] "Generic (PLEG): container finished" podID="e0636061-098d-4b79-b24d-ae0e070c8b17" containerID="f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49" exitCode=0 Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.012430 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" event={"ID":"e0636061-098d-4b79-b24d-ae0e070c8b17","Type":"ContainerDied","Data":"f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49"} Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.022820 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerStarted","Data":"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83"} Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.022879 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerStarted","Data":"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7"} Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.028176 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5"} Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.028485 4749 scope.go:117] "RemoveContainer" containerID="17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b" Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.028850 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.033617 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.048413 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.048455 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.048466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:54 crc 
kubenswrapper[4749]: I0310 15:49:54.048485 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.048499 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:54Z","lastTransitionTime":"2026-03-10T15:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.053457 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.075242 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.100015 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.120428 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.135548 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.154140 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.156413 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.156499 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.156521 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.156542 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.156576 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:54Z","lastTransitionTime":"2026-03-10T15:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.182524 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.198250 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.217899 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.233512 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.247056 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc 
kubenswrapper[4749]: I0310 15:49:54.260270 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.260331 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.260345 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.260289 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.260398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.260405 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:49:58.260361442 +0000 UTC m=+95.382227129 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.260415 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:54Z","lastTransitionTime":"2026-03-10T15:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.260763 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.260832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.261008 4749 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.261086 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.261115 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:58.261087271 +0000 UTC m=+95.382953128 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.261168 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:58.261145643 +0000 UTC m=+95.383011520 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.265013 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.284492 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.304988 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-ce
rt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-open
vswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.322179 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a
55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.340531 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.358928 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.361646 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs\") pod \"network-metrics-daemon-jpmqp\" (UID: \"cd3985af-f2c3-4f91-919e-2ea9420418b3\") " pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.361723 
4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.361749 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.361924 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.361950 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.361965 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.362023 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:49:58.362005456 +0000 UTC m=+95.483871143 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.362089 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.362111 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs podName:cd3985af-f2c3-4f91-919e-2ea9420418b3 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:58.362103439 +0000 UTC m=+95.483969126 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs") pod "network-metrics-daemon-jpmqp" (UID: "cd3985af-f2c3-4f91-919e-2ea9420418b3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.362169 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.362183 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.362191 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.362212 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:49:58.362206181 +0000 UTC m=+95.484071868 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.363823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.363858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.363885 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.363903 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.363913 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:54Z","lastTransitionTime":"2026-03-10T15:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.377045 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.390007 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.404727 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.422337 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.439596 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03
-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.455890 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.467504 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.467576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.467590 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.467613 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.467632 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:54Z","lastTransitionTime":"2026-03-10T15:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.477718 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.491224 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.506619 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc 
kubenswrapper[4749]: I0310 15:49:54.522416 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc 
kubenswrapper[4749]: I0310 15:49:54.534613 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.559257 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.570232 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.570489 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.570500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.570518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.570531 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:54Z","lastTransitionTime":"2026-03-10T15:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.606528 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.606588 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.607081 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.606716 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.606648 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.607252 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.607328 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.607433 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.624312 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.624368 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.624410 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.624432 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.624447 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:54Z","lastTransitionTime":"2026-03-10T15:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.638408 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.642518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.642569 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.642582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.642602 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.642616 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:54Z","lastTransitionTime":"2026-03-10T15:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.657575 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.661792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.661832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.661844 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.661863 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.661877 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:54Z","lastTransitionTime":"2026-03-10T15:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.717325 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:54 crc kubenswrapper[4749]: E0310 15:49:54.717519 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.719107 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.719140 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.719151 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.719168 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.719180 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:54Z","lastTransitionTime":"2026-03-10T15:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.821923 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.821959 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.821970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.821988 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.822000 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:54Z","lastTransitionTime":"2026-03-10T15:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.925426 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.925493 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.925506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.925530 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:54 crc kubenswrapper[4749]: I0310 15:49:54.925548 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:54Z","lastTransitionTime":"2026-03-10T15:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.028553 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.029549 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.029583 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.029636 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.029657 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:55Z","lastTransitionTime":"2026-03-10T15:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.036504 4749 generic.go:334] "Generic (PLEG): container finished" podID="e0636061-098d-4b79-b24d-ae0e070c8b17" containerID="a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748" exitCode=0 Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.036574 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" event={"ID":"e0636061-098d-4b79-b24d-ae0e070c8b17","Type":"ContainerDied","Data":"a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748"} Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.055591 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.071222 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.087582 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.104284 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.125049 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.132516 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.132575 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.132589 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.132609 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.132621 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:55Z","lastTransitionTime":"2026-03-10T15:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.136768 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc 
kubenswrapper[4749]: I0310 15:49:55.151635 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc 
kubenswrapper[4749]: I0310 15:49:55.162324 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.181408 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.201275 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.221444 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.234850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.234897 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.234909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.234930 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.234944 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:55Z","lastTransitionTime":"2026-03-10T15:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.235822 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40ee203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 
15:49:55.252964 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 
15:49:55.269327 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.283163 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.337856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.337920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.337941 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.337967 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.337981 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:55Z","lastTransitionTime":"2026-03-10T15:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.440634 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.440682 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.440693 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.440713 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.440726 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:55Z","lastTransitionTime":"2026-03-10T15:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.543727 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.543773 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.543783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.543801 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.543813 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:55Z","lastTransitionTime":"2026-03-10T15:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.647113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.647181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.647196 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.647217 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.647233 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:55Z","lastTransitionTime":"2026-03-10T15:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.750306 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.750339 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.750349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.750365 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.750390 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:55Z","lastTransitionTime":"2026-03-10T15:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.853216 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.853270 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.853282 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.853304 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.853319 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:55Z","lastTransitionTime":"2026-03-10T15:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.955701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.955779 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.955792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.955813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:55 crc kubenswrapper[4749]: I0310 15:49:55.955827 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:55Z","lastTransitionTime":"2026-03-10T15:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.043325 4749 generic.go:334] "Generic (PLEG): container finished" podID="e0636061-098d-4b79-b24d-ae0e070c8b17" containerID="21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c" exitCode=0 Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.043457 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" event={"ID":"e0636061-098d-4b79-b24d-ae0e070c8b17","Type":"ContainerDied","Data":"21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c"} Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.047598 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerStarted","Data":"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f"} Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.059505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.059535 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.059582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.059600 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.059612 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:56Z","lastTransitionTime":"2026-03-10T15:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.063746 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.079043 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.091719 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.103197 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc 
kubenswrapper[4749]: I0310 15:49:56.115678 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.138799 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.155514 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edf
f9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.161684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.161761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.161775 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.161798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.161809 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:56Z","lastTransitionTime":"2026-03-10T15:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.172850 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.188623 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.204861 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.219790 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.238004 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.253534 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\
\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.263768 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.263816 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.263829 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.263849 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.263862 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:56Z","lastTransitionTime":"2026-03-10T15:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.271484 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.290728 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.366825 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.366867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.366876 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.366892 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.366902 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:56Z","lastTransitionTime":"2026-03-10T15:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.470422 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.470868 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.470885 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.470907 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.470919 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:56Z","lastTransitionTime":"2026-03-10T15:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.573827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.573868 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.573878 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.573902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.573913 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:56Z","lastTransitionTime":"2026-03-10T15:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.606250 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.606363 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.606362 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.606435 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:49:56 crc kubenswrapper[4749]: E0310 15:49:56.606471 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:49:56 crc kubenswrapper[4749]: E0310 15:49:56.606634 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:49:56 crc kubenswrapper[4749]: E0310 15:49:56.606772 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:49:56 crc kubenswrapper[4749]: E0310 15:49:56.606861 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.676724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.676760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.676769 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.676789 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.676799 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:56Z","lastTransitionTime":"2026-03-10T15:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.778880 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.778924 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.778936 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.778954 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.778966 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:56Z","lastTransitionTime":"2026-03-10T15:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.882010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.882062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.882082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.882104 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.882117 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:56Z","lastTransitionTime":"2026-03-10T15:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.984811 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.984852 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.984862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.984877 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:56 crc kubenswrapper[4749]: I0310 15:49:56.984888 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:56Z","lastTransitionTime":"2026-03-10T15:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.054626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" event={"ID":"e0636061-098d-4b79-b24d-ae0e070c8b17","Type":"ContainerStarted","Data":"3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf"} Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.074714 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.087147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.087478 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.087547 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.087625 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.087532 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.087692 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:57Z","lastTransitionTime":"2026-03-10T15:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.101953 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.115894 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.131851 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkt
gg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.144039 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520e
d63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.155970 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.171758 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 
15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.190295 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.190958 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.191002 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.191017 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.191036 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.191050 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:57Z","lastTransitionTime":"2026-03-10T15:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.204977 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.219482 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.234148 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.248357 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.267015 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc 
kubenswrapper[4749]: I0310 15:49:57.283394 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:57 crc 
kubenswrapper[4749]: I0310 15:49:57.294420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.294476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.294488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.294506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.294519 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:57Z","lastTransitionTime":"2026-03-10T15:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.397341 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.397423 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.397437 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.397458 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.397472 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:57Z","lastTransitionTime":"2026-03-10T15:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.501285 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.501332 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.501343 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.501367 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.501431 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:57Z","lastTransitionTime":"2026-03-10T15:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.603971 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.604023 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.604036 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.604054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.604066 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:57Z","lastTransitionTime":"2026-03-10T15:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.707678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.707732 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.707742 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.707760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.707769 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:57Z","lastTransitionTime":"2026-03-10T15:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.810634 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.810677 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.810689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.810705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.810714 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:57Z","lastTransitionTime":"2026-03-10T15:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.913471 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.913512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.913523 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.913540 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:57 crc kubenswrapper[4749]: I0310 15:49:57.913550 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:57Z","lastTransitionTime":"2026-03-10T15:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.016950 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.017006 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.017021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.017080 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.017095 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:58Z","lastTransitionTime":"2026-03-10T15:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.120356 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.120426 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.120440 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.120458 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.120480 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:58Z","lastTransitionTime":"2026-03-10T15:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.222800 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.222833 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.222844 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.222873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.222884 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:58Z","lastTransitionTime":"2026-03-10T15:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.309457 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.309724 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 15:50:06.309683044 +0000 UTC m=+103.431548741 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.309795 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.309892 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.309958 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.310021 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.310109 4749 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:50:06.310055984 +0000 UTC m=+103.431921891 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.310188 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:50:06.310131346 +0000 UTC m=+103.431997033 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.325633 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.325687 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.325701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.325722 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.325735 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:58Z","lastTransitionTime":"2026-03-10T15:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.411082 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.411165 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.411214 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs\") pod \"network-metrics-daemon-jpmqp\" (UID: \"cd3985af-f2c3-4f91-919e-2ea9420418b3\") " pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.411359 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.411428 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.411478 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs podName:cd3985af-f2c3-4f91-919e-2ea9420418b3 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:50:06.411449111 +0000 UTC m=+103.533314798 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs") pod "network-metrics-daemon-jpmqp" (UID: "cd3985af-f2c3-4f91-919e-2ea9420418b3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.411444 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.411506 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.411522 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.411606 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:50:06.411578675 +0000 UTC m=+103.533444552 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.411486 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.411655 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.411711 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:50:06.411698198 +0000 UTC m=+103.533564085 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.428563 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.428615 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.428627 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.428646 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.428660 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:58Z","lastTransitionTime":"2026-03-10T15:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.531158 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.531213 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.531232 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.531255 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.531268 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:58Z","lastTransitionTime":"2026-03-10T15:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.606439 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.606476 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.606428 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.606611 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.606627 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.606684 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.606797 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:49:58 crc kubenswrapper[4749]: E0310 15:49:58.606893 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.634265 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.634297 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.634306 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.634323 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.634332 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:58Z","lastTransitionTime":"2026-03-10T15:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.737618 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.737675 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.737685 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.737703 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.737715 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:58Z","lastTransitionTime":"2026-03-10T15:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.839937 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.839976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.839988 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.840005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.840016 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:58Z","lastTransitionTime":"2026-03-10T15:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.942955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.943008 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.943021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.943046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:58 crc kubenswrapper[4749]: I0310 15:49:58.943059 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:58Z","lastTransitionTime":"2026-03-10T15:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.045771 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.045816 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.045826 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.045850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.045861 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:59Z","lastTransitionTime":"2026-03-10T15:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.070028 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerStarted","Data":"1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64"} Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.070614 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.070728 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.070788 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.076063 4749 generic.go:334] "Generic (PLEG): container finished" podID="e0636061-098d-4b79-b24d-ae0e070c8b17" containerID="3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf" exitCode=0 Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.076135 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" event={"ID":"e0636061-098d-4b79-b24d-ae0e070c8b17","Type":"ContainerDied","Data":"3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf"} Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.091042 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.106791 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.119176 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.119247 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.120941 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc 
kubenswrapper[4749]: I0310 15:49:59.135442 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc 
kubenswrapper[4749]: I0310 15:49:59.147821 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.149151 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.149196 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.149208 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.149228 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.149248 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:59Z","lastTransitionTime":"2026-03-10T15:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.167011 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.179852 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.192836 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.205727 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.219155 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.233807 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkt
gg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.243975 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520e
d63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.252043 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.252108 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.252127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.252147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.252157 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:59Z","lastTransitionTime":"2026-03-10T15:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.260458 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.273641 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.288048 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.303104 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.315064 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc 
kubenswrapper[4749]: I0310 15:49:59.328619 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc 
kubenswrapper[4749]: I0310 15:49:59.342841 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.355586 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.355645 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.355656 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.355675 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.355686 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:59Z","lastTransitionTime":"2026-03-10T15:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.356160 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.380183 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.399895 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.417987 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.432422 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.447470 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.459416 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.459475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.459491 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.459512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.459529 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:59Z","lastTransitionTime":"2026-03-10T15:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.464258 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.497008 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.513897 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.529475 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.543639 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:49:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.576644 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.576713 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.576724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.576744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.576761 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:59Z","lastTransitionTime":"2026-03-10T15:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.679624 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.679673 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.679685 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.679703 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.679714 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:59Z","lastTransitionTime":"2026-03-10T15:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.782600 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.782645 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.782658 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.782678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.782691 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:59Z","lastTransitionTime":"2026-03-10T15:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.885272 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.885319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.885334 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.885351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.885365 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:59Z","lastTransitionTime":"2026-03-10T15:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.988342 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.988403 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.988417 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.988434 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:49:59 crc kubenswrapper[4749]: I0310 15:49:59.988446 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:49:59Z","lastTransitionTime":"2026-03-10T15:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.092311 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.092371 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.092405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.092429 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.092444 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:00Z","lastTransitionTime":"2026-03-10T15:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.194597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.194665 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.194680 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.194701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.194718 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:00Z","lastTransitionTime":"2026-03-10T15:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.297328 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.297449 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.297468 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.297487 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.297500 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:00Z","lastTransitionTime":"2026-03-10T15:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.400616 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.400653 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.400666 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.400685 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.400696 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:00Z","lastTransitionTime":"2026-03-10T15:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.503803 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.503850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.503861 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.503879 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.503888 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:00Z","lastTransitionTime":"2026-03-10T15:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.605718 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.605760 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.605807 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.605733 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:00 crc kubenswrapper[4749]: E0310 15:50:00.605885 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:00 crc kubenswrapper[4749]: E0310 15:50:00.606004 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:00 crc kubenswrapper[4749]: E0310 15:50:00.606137 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:00 crc kubenswrapper[4749]: E0310 15:50:00.606252 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.606817 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.606844 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.606855 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.606869 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.606881 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:00Z","lastTransitionTime":"2026-03-10T15:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.709835 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.709889 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.709902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.709922 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.709933 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:00Z","lastTransitionTime":"2026-03-10T15:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.812617 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.812687 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.812701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.813138 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.813182 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:00Z","lastTransitionTime":"2026-03-10T15:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.915631 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.915681 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.915694 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.915714 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:00 crc kubenswrapper[4749]: I0310 15:50:00.915729 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:00Z","lastTransitionTime":"2026-03-10T15:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.018057 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.018113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.018126 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.018147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.018161 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:01Z","lastTransitionTime":"2026-03-10T15:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.089008 4749 generic.go:334] "Generic (PLEG): container finished" podID="e0636061-098d-4b79-b24d-ae0e070c8b17" containerID="d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63" exitCode=0 Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.089148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" event={"ID":"e0636061-098d-4b79-b24d-ae0e070c8b17","Type":"ContainerDied","Data":"d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63"} Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.109039 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.122729 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.123176 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.123197 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.123221 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.123236 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:01Z","lastTransitionTime":"2026-03-10T15:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.128061 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.143754 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.163932 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.177564 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.190085 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.206252 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.218149 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03
-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.225495 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.225547 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.225558 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.225577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.225589 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:01Z","lastTransitionTime":"2026-03-10T15:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.232076 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.243698 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.255421 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.269354 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc 
kubenswrapper[4749]: I0310 15:50:01.286055 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc 
kubenswrapper[4749]: I0310 15:50:01.303249 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.328106 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.328157 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.328169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.328188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.328203 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:01Z","lastTransitionTime":"2026-03-10T15:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.329535 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.431191 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.431249 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.431266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.431293 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.431309 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:01Z","lastTransitionTime":"2026-03-10T15:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.534188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.534233 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.534244 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.534264 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.534277 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:01Z","lastTransitionTime":"2026-03-10T15:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.637305 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.637364 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.637397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.637422 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.637437 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:01Z","lastTransitionTime":"2026-03-10T15:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.740096 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.740140 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.740150 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.740167 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.740181 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:01Z","lastTransitionTime":"2026-03-10T15:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.842870 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.842916 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.842926 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.842947 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.842960 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:01Z","lastTransitionTime":"2026-03-10T15:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.945930 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.946107 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.946166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.946188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:01 crc kubenswrapper[4749]: I0310 15:50:01.946200 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:01Z","lastTransitionTime":"2026-03-10T15:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.049151 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.049188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.049199 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.049218 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.049228 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:02Z","lastTransitionTime":"2026-03-10T15:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.096925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" event={"ID":"e0636061-098d-4b79-b24d-ae0e070c8b17","Type":"ContainerStarted","Data":"f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9"} Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.110866 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.123029 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.135633 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc 
kubenswrapper[4749]: I0310 15:50:02.148933 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc 
kubenswrapper[4749]: I0310 15:50:02.153449 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.153485 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.153496 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.153517 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.153530 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:02Z","lastTransitionTime":"2026-03-10T15:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.164596 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.186588 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.204239 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.220896 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.237007 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.254904 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.257522 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.257576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.257589 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.257614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.257629 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:02Z","lastTransitionTime":"2026-03-10T15:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.269181 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.287072 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300
c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.303142 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\
"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.334006 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b
7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.347663 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.360233 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 
15:50:02.360265 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.360275 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.360292 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.360303 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:02Z","lastTransitionTime":"2026-03-10T15:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.462636 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.462672 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.462682 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.462701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.462711 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:02Z","lastTransitionTime":"2026-03-10T15:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.565874 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.565920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.565930 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.565951 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.565962 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:02Z","lastTransitionTime":"2026-03-10T15:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.606523 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.606639 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.606648 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.606777 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:02 crc kubenswrapper[4749]: E0310 15:50:02.606784 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:02 crc kubenswrapper[4749]: E0310 15:50:02.606961 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:02 crc kubenswrapper[4749]: E0310 15:50:02.607494 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:02 crc kubenswrapper[4749]: E0310 15:50:02.607649 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.669901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.669942 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.669952 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.669967 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.669978 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:02Z","lastTransitionTime":"2026-03-10T15:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.773002 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.773318 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.773422 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.773713 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.773816 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:02Z","lastTransitionTime":"2026-03-10T15:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.876993 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.877046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.877058 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.877090 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.877110 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:02Z","lastTransitionTime":"2026-03-10T15:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.978998 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.979039 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.979049 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.979066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:02 crc kubenswrapper[4749]: I0310 15:50:02.979076 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:02Z","lastTransitionTime":"2026-03-10T15:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.081312 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.081360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.081398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.081421 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.081440 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:03Z","lastTransitionTime":"2026-03-10T15:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.101064 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/0.log" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.103507 4749 generic.go:334] "Generic (PLEG): container finished" podID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerID="1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64" exitCode=1 Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.103545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerDied","Data":"1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64"} Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.104466 4749 scope.go:117] "RemoveContainer" containerID="1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.121601 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.136063 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.150643 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.166691 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.181401 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.183515 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.183555 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.183567 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.183586 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.183600 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:03Z","lastTransitionTime":"2026-03-10T15:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.195443 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc 
kubenswrapper[4749]: I0310 15:50:03.211245 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc 
kubenswrapper[4749]: I0310 15:50:03.224070 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.243926 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"message\\\":\\\"aselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 15:50:02.460785 6543 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.460933 6543 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461248 6543 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461630 6543 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461723 6543 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.462573 6543 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 15:50:02.462594 6543 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 15:50:02.462608 6543 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:50:02.462641 6543 factory.go:656] Stopping watch factory\\\\nI0310 15:50:02.462660 6543 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:50:02.462690 6543 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:50:02.462701 6543 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 
15:50:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25
db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.260768 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.278482 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.285711 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.285749 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.285760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.285780 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.285791 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:03Z","lastTransitionTime":"2026-03-10T15:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.293399 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40ee203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 
15:50:03.320316 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e679779
91d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.331828 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.345501 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.388366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.388423 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.388432 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:03 crc 
kubenswrapper[4749]: I0310 15:50:03.388449 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.388458 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:03Z","lastTransitionTime":"2026-03-10T15:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.490784 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.490851 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.490861 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.490882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.490894 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:03Z","lastTransitionTime":"2026-03-10T15:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.593634 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.593684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.593694 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.593710 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.593720 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:03Z","lastTransitionTime":"2026-03-10T15:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.619917 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.643823 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"message\\\":\\\"aselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 15:50:02.460785 6543 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.460933 6543 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461248 6543 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461630 6543 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461723 6543 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.462573 6543 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 15:50:02.462594 6543 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 15:50:02.462608 6543 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:50:02.462641 6543 factory.go:656] Stopping watch factory\\\\nI0310 15:50:02.462660 6543 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:50:02.462690 6543 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:50:02.462701 6543 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 
15:50:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25
db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.659150 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.680073 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.695695 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.695756 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.695767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.695788 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.695799 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:03Z","lastTransitionTime":"2026-03-10T15:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.698099 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.710360 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300
c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.724774 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.739648 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.754234 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.769918 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.784099 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.798628 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.798684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.798699 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.798719 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.798732 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:03Z","lastTransitionTime":"2026-03-10T15:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.798954 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.810634 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc 
kubenswrapper[4749]: I0310 15:50:03.826202 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc 
kubenswrapper[4749]: I0310 15:50:03.841654 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:03Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.900917 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.900974 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.900990 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.901010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:03 crc kubenswrapper[4749]: I0310 15:50:03.901024 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:03Z","lastTransitionTime":"2026-03-10T15:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.003944 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.003996 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.004008 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.004029 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.004042 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:04Z","lastTransitionTime":"2026-03-10T15:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.106901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.106962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.106979 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.107010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.107029 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:04Z","lastTransitionTime":"2026-03-10T15:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.113206 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/0.log" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.119620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerStarted","Data":"98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa"} Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.120217 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.140603 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.155166 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.168288 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.184526 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.195272 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.209070 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.210055 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.210091 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.210106 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:04 crc 
kubenswrapper[4749]: I0310 15:50:04.210127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.210142 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:04Z","lastTransitionTime":"2026-03-10T15:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.223781 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.235718 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.249587 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.261667 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.274550 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.285893 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc 
kubenswrapper[4749]: I0310 15:50:04.300905 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc 
kubenswrapper[4749]: I0310 15:50:04.312430 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.312739 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.312815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.312961 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.313052 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:04Z","lastTransitionTime":"2026-03-10T15:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.313998 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.340731 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"message\\\":\\\"aselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 15:50:02.460785 6543 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.460933 6543 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461248 6543 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461630 6543 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461723 6543 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.462573 6543 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 15:50:02.462594 6543 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 15:50:02.462608 6543 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:50:02.462641 6543 factory.go:656] Stopping watch factory\\\\nI0310 15:50:02.462660 6543 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:50:02.462690 6543 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:50:02.462701 6543 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 
15:50:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:04Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.416159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.416212 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.416222 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.416240 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.416253 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:04Z","lastTransitionTime":"2026-03-10T15:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.519491 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.519545 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.519560 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.519579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.519592 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:04Z","lastTransitionTime":"2026-03-10T15:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.606013 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.606063 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.606061 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.606013 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:04 crc kubenswrapper[4749]: E0310 15:50:04.606187 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:04 crc kubenswrapper[4749]: E0310 15:50:04.606236 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:04 crc kubenswrapper[4749]: E0310 15:50:04.606454 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:04 crc kubenswrapper[4749]: E0310 15:50:04.606565 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.622674 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.622724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.622733 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.622753 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.622764 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:04Z","lastTransitionTime":"2026-03-10T15:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.725305 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.725364 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.725398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.725421 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.725435 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:04Z","lastTransitionTime":"2026-03-10T15:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.827484 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.827566 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.827579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.827598 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.827875 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:04Z","lastTransitionTime":"2026-03-10T15:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.930155 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.930193 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.930206 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.930226 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:04 crc kubenswrapper[4749]: I0310 15:50:04.930236 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:04Z","lastTransitionTime":"2026-03-10T15:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.033075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.033137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.033148 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.033169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.033180 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:05Z","lastTransitionTime":"2026-03-10T15:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.119271 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.119337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.119361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.119412 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.119452 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:05Z","lastTransitionTime":"2026-03-10T15:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:05 crc kubenswrapper[4749]: E0310 15:50:05.140820 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:05Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.145837 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.145897 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.145907 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.145923 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.145933 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:05Z","lastTransitionTime":"2026-03-10T15:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:05 crc kubenswrapper[4749]: E0310 15:50:05.160034 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:05Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.163984 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.164034 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.164046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.164066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.164080 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:05Z","lastTransitionTime":"2026-03-10T15:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:05 crc kubenswrapper[4749]: E0310 15:50:05.181972 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:05Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.187668 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.187721 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.187741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.187768 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.187789 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:05Z","lastTransitionTime":"2026-03-10T15:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:05 crc kubenswrapper[4749]: E0310 15:50:05.204134 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:05Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.208513 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.208563 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.208576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.208603 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.208616 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:05Z","lastTransitionTime":"2026-03-10T15:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:05 crc kubenswrapper[4749]: E0310 15:50:05.220978 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:05Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:05 crc kubenswrapper[4749]: E0310 15:50:05.221129 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.223154 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.223195 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.223206 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.223225 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.223236 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:05Z","lastTransitionTime":"2026-03-10T15:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.325987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.326445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.326457 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.326480 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.326493 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:05Z","lastTransitionTime":"2026-03-10T15:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.429993 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.430054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.430065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.430081 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.430091 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:05Z","lastTransitionTime":"2026-03-10T15:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.533131 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.533177 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.533189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.533212 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.533226 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:05Z","lastTransitionTime":"2026-03-10T15:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.636060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.636094 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.636102 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.636119 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.636128 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:05Z","lastTransitionTime":"2026-03-10T15:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.738232 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.738283 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.738295 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.738318 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.738331 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:05Z","lastTransitionTime":"2026-03-10T15:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.841330 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.841362 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.841391 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.841425 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.841439 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:05Z","lastTransitionTime":"2026-03-10T15:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.944411 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.944456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.944469 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.944492 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:05 crc kubenswrapper[4749]: I0310 15:50:05.944506 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:05Z","lastTransitionTime":"2026-03-10T15:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.047247 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.047302 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.047316 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.047336 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.047350 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:06Z","lastTransitionTime":"2026-03-10T15:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.128457 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/1.log" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.129128 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/0.log" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.133154 4749 generic.go:334] "Generic (PLEG): container finished" podID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerID="98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa" exitCode=1 Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.133201 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerDied","Data":"98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa"} Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.133241 4749 scope.go:117] "RemoveContainer" containerID="1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.134127 4749 scope.go:117] "RemoveContainer" containerID="98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa" Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.134300 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.149923 4749 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.149982 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.149993 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.150011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.150021 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:06Z","lastTransitionTime":"2026-03-10T15:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.153323 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.172857 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.188780 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc 
kubenswrapper[4749]: I0310 15:50:06.208624 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc 
kubenswrapper[4749]: I0310 15:50:06.232430 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"message\\\":\\\"aselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 15:50:02.460785 6543 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.460933 6543 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461248 6543 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461630 6543 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461723 6543 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.462573 6543 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 15:50:02.462594 6543 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 15:50:02.462608 6543 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:50:02.462641 6543 factory.go:656] Stopping watch factory\\\\nI0310 15:50:02.462660 6543 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:50:02.462690 6543 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:50:02.462701 6543 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 15:50:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\" 6726 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:50:05.248862 6726 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 15:50:05.248894 6726 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:50:05.248935 6726 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:50:05.248941 6726 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:50:05.248940 6726 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:50:05.248977 
6726 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 15:50:05.248981 6726 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 15:50:05.248998 6726 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:50:05.249038 6726 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 15:50:05.249055 6726 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 15:50:05.249064 6726 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:50:05.249072 6726 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 15:50:05.249118 6726 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:50:05.249124 6726 factory.go:656] Stopping watch factory\\\\nI0310 15:50:05.249132 6726 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.245724 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.252329 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.252402 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.252416 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.252438 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.252454 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:06Z","lastTransitionTime":"2026-03-10T15:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.261238 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.272938 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.288251 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.300411 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.312918 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.329073 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 
15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.341711 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.353216 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.354821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.354861 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.354871 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.354892 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.354905 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:06Z","lastTransitionTime":"2026-03-10T15:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.365803 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:06Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.396216 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.396364 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.396443 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.396531 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:50:22.396487691 +0000 UTC m=+119.518353538 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.396604 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.396642 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.396687 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:50:22.396664676 +0000 UTC m=+119.518530543 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.396846 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:50:22.39682806 +0000 UTC m=+119.518693747 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.458022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.458060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.458072 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.458090 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.458103 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:06Z","lastTransitionTime":"2026-03-10T15:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.498033 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.498106 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.498152 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs\") pod \"network-metrics-daemon-jpmqp\" (UID: \"cd3985af-f2c3-4f91-919e-2ea9420418b3\") " pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.498256 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.498292 4749 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.498259 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.498309 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.498397 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs podName:cd3985af-f2c3-4f91-919e-2ea9420418b3 nodeName:}" failed. No retries permitted until 2026-03-10 15:50:22.498352792 +0000 UTC m=+119.620218499 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs") pod "network-metrics-daemon-jpmqp" (UID: "cd3985af-f2c3-4f91-919e-2ea9420418b3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.498400 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.498442 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.498456 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.498418 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:50:22.498407933 +0000 UTC m=+119.620273630 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.498536 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:50:22.498523537 +0000 UTC m=+119.620389224 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.561681 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.561719 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.561729 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.561748 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.561759 4749 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:06Z","lastTransitionTime":"2026-03-10T15:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.606898 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.606953 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.607062 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.607005 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.606987 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.607266 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.607410 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:06 crc kubenswrapper[4749]: E0310 15:50:06.607565 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.664645 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.664700 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.664717 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.664744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.664760 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:06Z","lastTransitionTime":"2026-03-10T15:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.767431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.767474 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.767483 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.767501 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.767513 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:06Z","lastTransitionTime":"2026-03-10T15:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.870390 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.870445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.870456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.870478 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.870490 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:06Z","lastTransitionTime":"2026-03-10T15:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.973476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.973519 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.973530 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.973546 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:06 crc kubenswrapper[4749]: I0310 15:50:06.973557 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:06Z","lastTransitionTime":"2026-03-10T15:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.077086 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.077130 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.077148 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.077173 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.077186 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:07Z","lastTransitionTime":"2026-03-10T15:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.138487 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/1.log" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.179228 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.179274 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.179286 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.179302 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.179313 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:07Z","lastTransitionTime":"2026-03-10T15:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.281941 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.281989 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.282000 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.282020 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.282031 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:07Z","lastTransitionTime":"2026-03-10T15:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.384752 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.384812 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.384832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.384856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.384879 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:07Z","lastTransitionTime":"2026-03-10T15:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.487497 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.487565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.487580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.487600 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.487613 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:07Z","lastTransitionTime":"2026-03-10T15:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.589518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.589560 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.589570 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.589588 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.589599 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:07Z","lastTransitionTime":"2026-03-10T15:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.607311 4749 scope.go:117] "RemoveContainer" containerID="17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b" Mar 10 15:50:07 crc kubenswrapper[4749]: E0310 15:50:07.607601 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.692709 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.692757 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.692768 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.692792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.692806 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:07Z","lastTransitionTime":"2026-03-10T15:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.795567 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.795612 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.795622 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.795640 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.795651 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:07Z","lastTransitionTime":"2026-03-10T15:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.898292 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.898349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.898366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.898402 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:07 crc kubenswrapper[4749]: I0310 15:50:07.898419 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:07Z","lastTransitionTime":"2026-03-10T15:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.001616 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.001664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.001673 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.001723 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.001735 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:08Z","lastTransitionTime":"2026-03-10T15:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.104110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.104180 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.104196 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.104214 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.104228 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:08Z","lastTransitionTime":"2026-03-10T15:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.207315 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.207366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.207398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.207416 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.207426 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:08Z","lastTransitionTime":"2026-03-10T15:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.311312 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.311396 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.311412 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.311432 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.311444 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:08Z","lastTransitionTime":"2026-03-10T15:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.415117 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.415180 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.415209 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.415234 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.415250 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:08Z","lastTransitionTime":"2026-03-10T15:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.519579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.519681 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.519695 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.519718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.519734 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:08Z","lastTransitionTime":"2026-03-10T15:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.606420 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.606463 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.606543 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.606627 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:08 crc kubenswrapper[4749]: E0310 15:50:08.606786 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:08 crc kubenswrapper[4749]: E0310 15:50:08.606916 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:08 crc kubenswrapper[4749]: E0310 15:50:08.607146 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:08 crc kubenswrapper[4749]: E0310 15:50:08.607308 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.622356 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.622420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.622434 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.622458 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.622478 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:08Z","lastTransitionTime":"2026-03-10T15:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.725036 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.725081 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.725091 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.725109 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.725121 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:08Z","lastTransitionTime":"2026-03-10T15:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.828010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.828056 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.828066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.828088 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.828098 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:08Z","lastTransitionTime":"2026-03-10T15:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.931046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.931093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.931104 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.931122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:08 crc kubenswrapper[4749]: I0310 15:50:08.931133 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:08Z","lastTransitionTime":"2026-03-10T15:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.034215 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.034285 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.034301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.034332 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.034349 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:09Z","lastTransitionTime":"2026-03-10T15:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.136285 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.136327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.136339 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.136357 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.136370 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:09Z","lastTransitionTime":"2026-03-10T15:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.238861 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.238929 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.238946 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.238970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.238984 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:09Z","lastTransitionTime":"2026-03-10T15:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.342328 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.342405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.342420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.342442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.342459 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:09Z","lastTransitionTime":"2026-03-10T15:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.445608 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.445697 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.445718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.445745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.445767 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:09Z","lastTransitionTime":"2026-03-10T15:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.548152 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.548198 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.548208 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.548226 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.548236 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:09Z","lastTransitionTime":"2026-03-10T15:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.650255 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.650324 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.650335 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.650355 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.650395 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:09Z","lastTransitionTime":"2026-03-10T15:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.753724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.753814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.753837 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.753867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.753889 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:09Z","lastTransitionTime":"2026-03-10T15:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.856780 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.856831 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.856845 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.856864 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.856876 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:09Z","lastTransitionTime":"2026-03-10T15:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.960359 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.960419 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.960429 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.960446 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:09 crc kubenswrapper[4749]: I0310 15:50:09.960461 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:09Z","lastTransitionTime":"2026-03-10T15:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.063614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.063726 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.063744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.063777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.063801 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:10Z","lastTransitionTime":"2026-03-10T15:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.167432 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.167482 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.167494 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.167525 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.167538 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:10Z","lastTransitionTime":"2026-03-10T15:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.270612 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.270663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.270690 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.270709 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.270730 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:10Z","lastTransitionTime":"2026-03-10T15:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.373852 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.373900 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.373915 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.373937 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.373951 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:10Z","lastTransitionTime":"2026-03-10T15:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.477061 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.478066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.478251 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.478421 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.478616 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:10Z","lastTransitionTime":"2026-03-10T15:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.581230 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.581290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.581302 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.581322 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.581333 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:10Z","lastTransitionTime":"2026-03-10T15:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.606660 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.606653 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.606814 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:10 crc kubenswrapper[4749]: E0310 15:50:10.606948 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.606680 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:10 crc kubenswrapper[4749]: E0310 15:50:10.607062 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:10 crc kubenswrapper[4749]: E0310 15:50:10.607205 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:10 crc kubenswrapper[4749]: E0310 15:50:10.607311 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.684522 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.684578 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.684591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.684612 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.684633 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:10Z","lastTransitionTime":"2026-03-10T15:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.787641 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.787689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.787698 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.787716 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.787728 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:10Z","lastTransitionTime":"2026-03-10T15:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.890103 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.890157 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.890168 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.890186 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.890223 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:10Z","lastTransitionTime":"2026-03-10T15:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.992168 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.992209 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.992221 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.992241 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:10 crc kubenswrapper[4749]: I0310 15:50:10.992255 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:10Z","lastTransitionTime":"2026-03-10T15:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.096534 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.097348 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.097463 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.097561 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.097660 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:11Z","lastTransitionTime":"2026-03-10T15:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.206058 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.206148 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.206253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.206291 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.206314 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:11Z","lastTransitionTime":"2026-03-10T15:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.309472 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.309524 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.309534 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.309554 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.309565 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:11Z","lastTransitionTime":"2026-03-10T15:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.412304 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.412348 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.412402 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.412420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.412432 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:11Z","lastTransitionTime":"2026-03-10T15:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.515339 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.515397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.515413 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.515432 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.515444 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:11Z","lastTransitionTime":"2026-03-10T15:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.618350 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.618422 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.618437 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.618451 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.618461 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:11Z","lastTransitionTime":"2026-03-10T15:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.722424 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.722487 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.722501 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.722517 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.722529 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:11Z","lastTransitionTime":"2026-03-10T15:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.825449 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.825502 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.825515 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.825537 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.825553 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:11Z","lastTransitionTime":"2026-03-10T15:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.928643 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.928742 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.928767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.928797 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:11 crc kubenswrapper[4749]: I0310 15:50:11.928817 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:11Z","lastTransitionTime":"2026-03-10T15:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.031823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.031865 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.031875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.031893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.031903 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:12Z","lastTransitionTime":"2026-03-10T15:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.134875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.135022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.135337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.135365 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.135412 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:12Z","lastTransitionTime":"2026-03-10T15:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.238712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.238760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.238770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.238787 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.238799 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:12Z","lastTransitionTime":"2026-03-10T15:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.341518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.341582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.341598 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.341623 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.341636 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:12Z","lastTransitionTime":"2026-03-10T15:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.444998 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.445061 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.445078 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.445096 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.445109 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:12Z","lastTransitionTime":"2026-03-10T15:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.547765 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.547815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.547825 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.547842 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.547855 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:12Z","lastTransitionTime":"2026-03-10T15:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.606527 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.606573 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.606667 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:12 crc kubenswrapper[4749]: E0310 15:50:12.607123 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.606755 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:12 crc kubenswrapper[4749]: E0310 15:50:12.607731 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:12 crc kubenswrapper[4749]: E0310 15:50:12.607938 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:12 crc kubenswrapper[4749]: E0310 15:50:12.608116 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.620999 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.650945 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.650984 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.650996 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.651012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.651023 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:12Z","lastTransitionTime":"2026-03-10T15:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.753863 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.753902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.753912 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.753933 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.753942 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:12Z","lastTransitionTime":"2026-03-10T15:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.856838 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.856889 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.856901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.856921 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.856933 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:12Z","lastTransitionTime":"2026-03-10T15:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.959943 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.960279 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.960407 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.960522 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:12 crc kubenswrapper[4749]: I0310 15:50:12.960624 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:12Z","lastTransitionTime":"2026-03-10T15:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.063727 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.063801 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.063829 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.063863 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.063887 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:13Z","lastTransitionTime":"2026-03-10T15:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.165599 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.165635 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.165645 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.165664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.165677 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:13Z","lastTransitionTime":"2026-03-10T15:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.268252 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.268311 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.268324 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.268345 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.268357 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:13Z","lastTransitionTime":"2026-03-10T15:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.370462 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.370530 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.370542 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.370562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.370579 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:13Z","lastTransitionTime":"2026-03-10T15:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.473611 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.473661 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.473671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.473689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.473704 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:13Z","lastTransitionTime":"2026-03-10T15:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.577710 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.577750 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.577762 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.577778 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.577789 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:13Z","lastTransitionTime":"2026-03-10T15:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.625840 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.639856 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc 
kubenswrapper[4749]: I0310 15:50:13.655731 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc 
kubenswrapper[4749]: I0310 15:50:13.672602 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.679815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.679854 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.679866 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.679885 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.679898 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:13Z","lastTransitionTime":"2026-03-10T15:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.689571 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.716100 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a29866e618ea4bfd0512a88510f30c16b7e8db0d226e1098d7610c8dabb0a64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"message\\\":\\\"aselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 15:50:02.460785 6543 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.460933 6543 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461248 6543 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461630 6543 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.461723 6543 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 15:50:02.462573 6543 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 15:50:02.462594 6543 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 15:50:02.462608 6543 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:50:02.462641 6543 factory.go:656] Stopping watch factory\\\\nI0310 15:50:02.462660 6543 ovnkube.go:599] Stopped ovnkube\\\\nI0310 15:50:02.462690 6543 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:50:02.462701 6543 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 15:50:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\" 6726 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:50:05.248862 6726 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 15:50:05.248894 6726 handler.go:190] Sending *v1.NetworkPolicy event handler 
4 for removal\\\\nI0310 15:50:05.248935 6726 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:50:05.248941 6726 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:50:05.248940 6726 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:50:05.248977 6726 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 15:50:05.248981 6726 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 15:50:05.248998 6726 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:50:05.249038 6726 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 15:50:05.249055 6726 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 15:50:05.249064 6726 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:50:05.249072 6726 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 15:50:05.249118 6726 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:50:05.249124 6726 factory.go:656] Stopping watch factory\\\\nI0310 15:50:05.249132 6726 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.730847 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.746820 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.761157 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.774663 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.782016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.782081 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.782097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:13 crc 
kubenswrapper[4749]: I0310 15:50:13.782124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.782140 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:13Z","lastTransitionTime":"2026-03-10T15:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.794189 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.808770 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.836193 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.852363 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.870308 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.882991 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:13Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.884492 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.884530 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.884544 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.884565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.884620 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:13Z","lastTransitionTime":"2026-03-10T15:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.988749 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.988826 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.988845 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.988873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:13 crc kubenswrapper[4749]: I0310 15:50:13.988896 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:13Z","lastTransitionTime":"2026-03-10T15:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.092350 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.092450 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.092468 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.092494 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.092517 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:14Z","lastTransitionTime":"2026-03-10T15:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.196127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.196230 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.196242 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.196281 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.196296 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:14Z","lastTransitionTime":"2026-03-10T15:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.299618 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.299656 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.299666 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.299680 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.299693 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:14Z","lastTransitionTime":"2026-03-10T15:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.403105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.403169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.403182 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.403205 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.403221 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:14Z","lastTransitionTime":"2026-03-10T15:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.507808 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.507880 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.507893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.507922 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.507938 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:14Z","lastTransitionTime":"2026-03-10T15:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.606659 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.607309 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:14 crc kubenswrapper[4749]: E0310 15:50:14.607525 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.607551 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:14 crc kubenswrapper[4749]: E0310 15:50:14.608281 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:14 crc kubenswrapper[4749]: E0310 15:50:14.608432 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.610587 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.610674 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.610702 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.610742 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.610778 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:14Z","lastTransitionTime":"2026-03-10T15:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.611942 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:14 crc kubenswrapper[4749]: E0310 15:50:14.612313 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.621971 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.717528 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.717585 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.717599 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.717620 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.717634 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:14Z","lastTransitionTime":"2026-03-10T15:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.820562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.820599 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.820611 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.820630 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.820642 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:14Z","lastTransitionTime":"2026-03-10T15:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.923514 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.923972 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.924086 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.924182 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:14 crc kubenswrapper[4749]: I0310 15:50:14.924271 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:14Z","lastTransitionTime":"2026-03-10T15:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.028540 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.028580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.028592 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.028611 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.028622 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:15Z","lastTransitionTime":"2026-03-10T15:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.131351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.131431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.131442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.131466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.131478 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:15Z","lastTransitionTime":"2026-03-10T15:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.234040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.234090 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.234101 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.234119 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.234129 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:15Z","lastTransitionTime":"2026-03-10T15:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.336301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.336344 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.336356 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.336397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.336410 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:15Z","lastTransitionTime":"2026-03-10T15:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.441311 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.441389 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.441402 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.441425 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.441439 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:15Z","lastTransitionTime":"2026-03-10T15:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.504156 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.504213 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.504230 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.504252 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.504269 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:15Z","lastTransitionTime":"2026-03-10T15:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:15 crc kubenswrapper[4749]: E0310 15:50:15.523096 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.527750 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.527796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.527808 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.527827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.527840 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:15Z","lastTransitionTime":"2026-03-10T15:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:15 crc kubenswrapper[4749]: E0310 15:50:15.542651 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.546955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.547001 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.547016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.547036 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.547049 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:15Z","lastTransitionTime":"2026-03-10T15:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:15 crc kubenswrapper[4749]: E0310 15:50:15.563173 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.567839 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.567917 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.567929 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.567949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.567961 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:15Z","lastTransitionTime":"2026-03-10T15:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:15 crc kubenswrapper[4749]: E0310 15:50:15.580242 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.588137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.588175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.588187 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.588213 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.588226 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:15Z","lastTransitionTime":"2026-03-10T15:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:15 crc kubenswrapper[4749]: E0310 15:50:15.600050 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:15Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:15 crc kubenswrapper[4749]: E0310 15:50:15.600187 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.602173 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.602219 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.602231 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.602249 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.602262 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:15Z","lastTransitionTime":"2026-03-10T15:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.704668 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.704724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.704734 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.704754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.704764 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:15Z","lastTransitionTime":"2026-03-10T15:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.807769 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.807813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.807823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.807839 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.807850 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:15Z","lastTransitionTime":"2026-03-10T15:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.910329 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.910421 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.910439 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.910462 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:15 crc kubenswrapper[4749]: I0310 15:50:15.910477 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:15Z","lastTransitionTime":"2026-03-10T15:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.014002 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.014054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.014066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.014086 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.014100 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:16Z","lastTransitionTime":"2026-03-10T15:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.116678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.116734 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.116750 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.116770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.116785 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:16Z","lastTransitionTime":"2026-03-10T15:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.220324 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.220388 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.220400 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.220420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.220431 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:16Z","lastTransitionTime":"2026-03-10T15:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.325032 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.325280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.325292 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.325311 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.325326 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:16Z","lastTransitionTime":"2026-03-10T15:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.428794 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.428856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.428875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.428896 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.428910 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:16Z","lastTransitionTime":"2026-03-10T15:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.532676 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.532750 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.532768 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.532800 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.532819 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:16Z","lastTransitionTime":"2026-03-10T15:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.606266 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.606404 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.606445 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:16 crc kubenswrapper[4749]: E0310 15:50:16.606455 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.606498 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:16 crc kubenswrapper[4749]: E0310 15:50:16.606584 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:16 crc kubenswrapper[4749]: E0310 15:50:16.606664 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:16 crc kubenswrapper[4749]: E0310 15:50:16.606785 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.635873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.635924 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.635939 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.635961 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.635976 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:16Z","lastTransitionTime":"2026-03-10T15:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.739098 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.739142 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.739152 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.739171 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.739185 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:16Z","lastTransitionTime":"2026-03-10T15:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.842016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.842073 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.842083 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.842103 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.842114 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:16Z","lastTransitionTime":"2026-03-10T15:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.944698 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.944749 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.944758 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.944778 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:16 crc kubenswrapper[4749]: I0310 15:50:16.944789 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:16Z","lastTransitionTime":"2026-03-10T15:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.048127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.048171 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.048181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.048200 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.048211 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:17Z","lastTransitionTime":"2026-03-10T15:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.151475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.151521 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.151536 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.151556 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.151572 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:17Z","lastTransitionTime":"2026-03-10T15:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.254065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.254112 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.254124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.254142 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.254155 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:17Z","lastTransitionTime":"2026-03-10T15:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.357005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.357043 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.357053 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.357070 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.357080 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:17Z","lastTransitionTime":"2026-03-10T15:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.460646 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.460705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.460718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.460739 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.460751 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:17Z","lastTransitionTime":"2026-03-10T15:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.563985 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.564047 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.564064 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.564085 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.564097 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:17Z","lastTransitionTime":"2026-03-10T15:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.672614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.672671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.672684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.672704 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.672716 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:17Z","lastTransitionTime":"2026-03-10T15:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.775726 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.775783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.775795 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.775813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.775823 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:17Z","lastTransitionTime":"2026-03-10T15:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.878509 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.878561 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.878573 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.878594 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.878606 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:17Z","lastTransitionTime":"2026-03-10T15:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.982131 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.982212 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.982238 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.982275 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:17 crc kubenswrapper[4749]: I0310 15:50:17.982303 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:17Z","lastTransitionTime":"2026-03-10T15:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.085560 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.085639 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.085664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.085698 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.085721 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:18Z","lastTransitionTime":"2026-03-10T15:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.188472 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.188514 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.188524 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.188540 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.188553 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:18Z","lastTransitionTime":"2026-03-10T15:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.292087 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.292131 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.292141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.292158 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.292170 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:18Z","lastTransitionTime":"2026-03-10T15:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.395001 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.395052 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.395063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.395080 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.395093 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:18Z","lastTransitionTime":"2026-03-10T15:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.497916 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.497985 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.497999 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.498024 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.498039 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:18Z","lastTransitionTime":"2026-03-10T15:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.601576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.601631 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.601644 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.601668 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.601683 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:18Z","lastTransitionTime":"2026-03-10T15:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.605783 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.605809 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.605834 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.605785 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:18 crc kubenswrapper[4749]: E0310 15:50:18.605963 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:18 crc kubenswrapper[4749]: E0310 15:50:18.606070 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:18 crc kubenswrapper[4749]: E0310 15:50:18.606186 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:18 crc kubenswrapper[4749]: E0310 15:50:18.606585 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.607001 4749 scope.go:117] "RemoveContainer" containerID="98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.633481 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\" 6726 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:50:05.248862 6726 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 15:50:05.248894 6726 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:50:05.248935 6726 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI0310 15:50:05.248941 6726 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:50:05.248940 6726 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:50:05.248977 6726 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 15:50:05.248981 6726 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 15:50:05.248998 6726 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:50:05.249038 6726 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 15:50:05.249055 6726 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 15:50:05.249064 6726 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:50:05.249072 6726 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 15:50:05.249118 6726 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:50:05.249124 6726 factory.go:656] Stopping watch factory\\\\nI0310 15:50:05.249132 6726 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0
b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.648608 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.664797 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.680368 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.699202 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.704696 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.704741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.704751 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.704772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.704784 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:18Z","lastTransitionTime":"2026-03-10T15:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.714042 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.730202 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300
c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.747198 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.770325 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.784915 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.802131 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.810740 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.810790 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.810803 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.810867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.810881 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:18Z","lastTransitionTime":"2026-03-10T15:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.827557 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.843052 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.858102 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.871106 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc 
kubenswrapper[4749]: I0310 15:50:18.886831 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc 
kubenswrapper[4749]: I0310 15:50:18.899576 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.913490 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.913563 4749 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.913575 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.913594 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:18 crc kubenswrapper[4749]: I0310 15:50:18.913606 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:18Z","lastTransitionTime":"2026-03-10T15:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.016564 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.016612 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.016623 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.016643 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.016654 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:19Z","lastTransitionTime":"2026-03-10T15:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.119796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.119846 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.119859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.119876 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.119891 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:19Z","lastTransitionTime":"2026-03-10T15:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.183178 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/1.log" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.186224 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerStarted","Data":"3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3"} Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.186759 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.207695 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d
89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.222684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.222738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.222749 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.222770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.222784 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:19Z","lastTransitionTime":"2026-03-10T15:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.225980 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.240628 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.258205 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.275126 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.296956 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc 
kubenswrapper[4749]: I0310 15:50:19.315521 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc 
kubenswrapper[4749]: I0310 15:50:19.325664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.325717 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.325727 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.325745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.325756 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:19Z","lastTransitionTime":"2026-03-10T15:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.333263 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.348936 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.361603 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.380246 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\" 6726 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:50:05.248862 6726 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 15:50:05.248894 6726 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:50:05.248935 6726 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI0310 15:50:05.248941 6726 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:50:05.248940 6726 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:50:05.248977 6726 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 15:50:05.248981 6726 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 15:50:05.248998 6726 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:50:05.249038 6726 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 15:50:05.249055 6726 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 15:50:05.249064 6726 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:50:05.249072 6726 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 15:50:05.249118 6726 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:50:05.249124 6726 factory.go:656] Stopping watch factory\\\\nI0310 15:50:05.249132 6726 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.394458 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.417343 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.428272 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.428333 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.428359 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.428397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.428416 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:19Z","lastTransitionTime":"2026-03-10T15:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.434371 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.448729 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300
c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.464457 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.484207 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:19Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.531694 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.531754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.531763 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.531782 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.531798 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:19Z","lastTransitionTime":"2026-03-10T15:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.634447 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.634531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.634554 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.634580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.634626 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:19Z","lastTransitionTime":"2026-03-10T15:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.737837 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.737913 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.737923 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.737947 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.737959 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:19Z","lastTransitionTime":"2026-03-10T15:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.841453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.841517 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.841527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.841545 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.841556 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:19Z","lastTransitionTime":"2026-03-10T15:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.945604 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.945714 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.945736 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.945765 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:19 crc kubenswrapper[4749]: I0310 15:50:19.945794 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:19Z","lastTransitionTime":"2026-03-10T15:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.048786 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.048862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.048886 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.048921 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.048944 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:20Z","lastTransitionTime":"2026-03-10T15:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.151085 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.151136 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.151146 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.151162 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.151171 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:20Z","lastTransitionTime":"2026-03-10T15:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.192065 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/2.log" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.193132 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/1.log" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.198104 4749 generic.go:334] "Generic (PLEG): container finished" podID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerID="3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3" exitCode=1 Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.198174 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerDied","Data":"3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3"} Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.198260 4749 scope.go:117] "RemoveContainer" containerID="98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.200242 4749 scope.go:117] "RemoveContainer" containerID="3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3" Mar 10 15:50:20 crc kubenswrapper[4749]: E0310 15:50:20.200880 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.215799 4749 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.243890 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98a8b07a29e5cb55f2488a154a3abd436082dff5f9504a6a6084d4a0eaa24caa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:05Z\\\",\\\"message\\\":\\\" 6726 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:50:05.248862 6726 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 15:50:05.248894 6726 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 15:50:05.248935 6726 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI0310 15:50:05.248941 6726 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:50:05.248940 6726 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:50:05.248977 6726 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 15:50:05.248981 6726 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 15:50:05.248998 6726 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:50:05.249038 6726 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 15:50:05.249055 6726 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 15:50:05.249064 6726 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:50:05.249072 6726 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 15:50:05.249118 6726 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:50:05.249124 6726 factory.go:656] Stopping watch factory\\\\nI0310 15:50:05.249132 6726 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:19Z\\\",\\\"message\\\":\\\"72c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 15:50:19.424815 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 15:50:19.424884 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25
db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.253539 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.253603 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.253616 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.253639 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.253654 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:20Z","lastTransitionTime":"2026-03-10T15:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.280644 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.310195 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.333781 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.354114 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.356445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.356491 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.356505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.356523 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.356538 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:20Z","lastTransitionTime":"2026-03-10T15:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.366016 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.378111 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300
c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.399077 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\
"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc9
2ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.415068 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.429705 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.445782 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.459260 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.459318 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.459331 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.459352 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.459365 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:20Z","lastTransitionTime":"2026-03-10T15:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.461812 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620583a9defb04405be325806cdad9
8188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.480485 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.495481 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.509519 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc 
kubenswrapper[4749]: I0310 15:50:20.524277 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:20Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:20 crc 
kubenswrapper[4749]: I0310 15:50:20.562937 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.562987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.563001 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.563022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.563042 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:20Z","lastTransitionTime":"2026-03-10T15:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.606694 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:20 crc kubenswrapper[4749]: E0310 15:50:20.607119 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.607192 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.607154 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:20 crc kubenswrapper[4749]: E0310 15:50:20.607482 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:20 crc kubenswrapper[4749]: E0310 15:50:20.607354 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.606736 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:20 crc kubenswrapper[4749]: E0310 15:50:20.607578 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.665503 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.665829 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.665901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.665978 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.666066 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:20Z","lastTransitionTime":"2026-03-10T15:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.769138 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.769187 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.769202 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.769218 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.769230 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:20Z","lastTransitionTime":"2026-03-10T15:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.871961 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.871994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.872003 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.872019 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.872028 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:20Z","lastTransitionTime":"2026-03-10T15:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.974716 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.974767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.974778 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.974794 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:20 crc kubenswrapper[4749]: I0310 15:50:20.974804 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:20Z","lastTransitionTime":"2026-03-10T15:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.078521 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.078919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.078938 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.078962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.078981 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:21Z","lastTransitionTime":"2026-03-10T15:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.182638 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.182676 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.182687 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.182705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.182720 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:21Z","lastTransitionTime":"2026-03-10T15:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.205462 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/2.log" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.209234 4749 scope.go:117] "RemoveContainer" containerID="3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3" Mar 10 15:50:21 crc kubenswrapper[4749]: E0310 15:50:21.209413 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.231665 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.247657 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.264891 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.281282 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 
15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.285642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.285682 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.285692 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.285708 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:21 crc 
kubenswrapper[4749]: I0310 15:50:21.285719 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:21Z","lastTransitionTime":"2026-03-10T15:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.295956 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.311280 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.327140 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.341053 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.355030 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.375791 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.388282 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.388326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.388337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.388361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.388388 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:21Z","lastTransitionTime":"2026-03-10T15:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.388462 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc 
kubenswrapper[4749]: I0310 15:50:21.405089 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc 
kubenswrapper[4749]: I0310 15:50:21.421263 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.438595 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.457015 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.471926 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.491366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.491432 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.491444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.491465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.491478 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:21Z","lastTransitionTime":"2026-03-10T15:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.498339 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:19Z\\\",\\\"message\\\":\\\"72c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 15:50:19.424815 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 15:50:19.424884 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0
b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:21Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.594045 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.594098 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.594114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.594136 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.594150 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:21Z","lastTransitionTime":"2026-03-10T15:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.607936 4749 scope.go:117] "RemoveContainer" containerID="17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.620323 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.704147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.704408 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.704431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.704498 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.704542 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:21Z","lastTransitionTime":"2026-03-10T15:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.807951 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.807990 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.807999 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.808015 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.808028 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:21Z","lastTransitionTime":"2026-03-10T15:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.911431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.911505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.911520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.911539 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:21 crc kubenswrapper[4749]: I0310 15:50:21.911554 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:21Z","lastTransitionTime":"2026-03-10T15:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.015547 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.015635 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.015659 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.015689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.015709 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:22Z","lastTransitionTime":"2026-03-10T15:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.118613 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.118666 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.118683 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.118703 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.118719 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:22Z","lastTransitionTime":"2026-03-10T15:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.216151 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.218516 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2"} Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.220949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.220993 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.221017 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.221038 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.221051 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:22Z","lastTransitionTime":"2026-03-10T15:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.231342 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.250851 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:19Z\\\",\\\"message\\\":\\\"72c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 15:50:19.424815 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 15:50:19.424884 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0
b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.268669 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.284126 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.297057 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.317692 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.323597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.323655 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.323671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.323694 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.323709 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:22Z","lastTransitionTime":"2026-03-10T15:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.332424 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.346884 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300
c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.364176 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079ee3fb-8259-47be-bfda-ac3dd5cc5e2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2ce914fe09be8e266078e486a824c15113758204e331d204314552889a176cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6520e37fbb486b4fe9ee982e761c8153efff9e4f9a136c0492860dd440ebbb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f728cec110ab3738832921769ef26fb951632b05de4c3fb1f04fbccc152c8df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.388247 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.405709 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.420764 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.426651 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.426690 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.426704 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.426722 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.426733 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:22Z","lastTransitionTime":"2026-03-10T15:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.435393 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.451102 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.465069 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.478090 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.478204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.478243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.478358 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:50:54.478319283 +0000 UTC m=+151.600184970 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.478365 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.478430 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.478469 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:50:54.478457577 +0000 UTC m=+151.600323264 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.478492 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:50:54.478482467 +0000 UTC m=+151.600348364 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.478812 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.491365 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc 
kubenswrapper[4749]: I0310 15:50:22.505183 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:22Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:22 crc 
kubenswrapper[4749]: I0310 15:50:22.529901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.529950 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.529962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.529986 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.530002 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:22Z","lastTransitionTime":"2026-03-10T15:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.578934 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.579003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.579041 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs\") pod \"network-metrics-daemon-jpmqp\" (UID: \"cd3985af-f2c3-4f91-919e-2ea9420418b3\") " pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.579121 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.579146 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.579150 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:50:22 crc 
kubenswrapper[4749]: E0310 15:50:22.579170 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.579213 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs podName:cd3985af-f2c3-4f91-919e-2ea9420418b3 nodeName:}" failed. No retries permitted until 2026-03-10 15:50:54.579193487 +0000 UTC m=+151.701059174 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs") pod "network-metrics-daemon-jpmqp" (UID: "cd3985af-f2c3-4f91-919e-2ea9420418b3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.579230 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:50:54.579222967 +0000 UTC m=+151.701088654 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.579288 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.579331 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.579347 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.579458 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:50:54.579435093 +0000 UTC m=+151.701300950 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.605860 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.605869 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.605888 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.606023 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.606184 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.606363 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.606450 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:22 crc kubenswrapper[4749]: E0310 15:50:22.606517 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.632792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.632835 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.632845 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.632863 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.632877 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:22Z","lastTransitionTime":"2026-03-10T15:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.735642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.735701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.735712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.735742 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.735759 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:22Z","lastTransitionTime":"2026-03-10T15:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.838738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.838840 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.838858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.838878 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.838891 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:22Z","lastTransitionTime":"2026-03-10T15:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.941166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.941207 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.941219 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.941241 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:22 crc kubenswrapper[4749]: I0310 15:50:22.941252 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:22Z","lastTransitionTime":"2026-03-10T15:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.045052 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.045113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.045125 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.045148 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.045159 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:23Z","lastTransitionTime":"2026-03-10T15:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.147740 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.147804 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.147821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.147839 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.147851 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:23Z","lastTransitionTime":"2026-03-10T15:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.250328 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.250635 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.250744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.250903 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.250965 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:23Z","lastTransitionTime":"2026-03-10T15:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.358984 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.359071 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.359087 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.359108 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.359118 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:23Z","lastTransitionTime":"2026-03-10T15:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.462100 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.462161 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.462171 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.462190 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.462202 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:23Z","lastTransitionTime":"2026-03-10T15:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:23 crc kubenswrapper[4749]: E0310 15:50:23.563031 4749 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.622834 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\
":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.643323 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:19Z\\\",\\\"message\\\":\\\"72c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid 
== {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 15:50:19.424815 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 15:50:19.424884 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0
b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.658027 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.675925 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 
15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: E0310 15:50:23.684794 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.695569 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.712355 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.731307 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.746536 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.766484 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.785311 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079ee3fb-8259-47be-bfda-ac3dd5cc5e2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2ce914fe09be8e266078e486a824c15113758204e331d204314552889a176cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6520e37fbb486b4fe9ee982e761c8153efff9e4f9a136c0492860dd440ebbb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f728cec110ab3738832921769ef26fb951632b05de4c3fb1f04fbccc152c8df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.811777 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.831118 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.850261 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.868715 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620
583a9defb04405be325806cdad98188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.884900 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.902606 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc kubenswrapper[4749]: I0310 15:50:23.917979 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:23 crc 
kubenswrapper[4749]: I0310 15:50:23.934926 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:23Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:24 crc 
kubenswrapper[4749]: I0310 15:50:24.606228 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:24 crc kubenswrapper[4749]: I0310 15:50:24.606299 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:24 crc kubenswrapper[4749]: E0310 15:50:24.606420 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:24 crc kubenswrapper[4749]: I0310 15:50:24.606498 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:24 crc kubenswrapper[4749]: E0310 15:50:24.606562 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:24 crc kubenswrapper[4749]: E0310 15:50:24.606692 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:24 crc kubenswrapper[4749]: I0310 15:50:24.607129 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:24 crc kubenswrapper[4749]: E0310 15:50:24.607227 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:25 crc kubenswrapper[4749]: I0310 15:50:25.968066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:25 crc kubenswrapper[4749]: I0310 15:50:25.968361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:25 crc kubenswrapper[4749]: I0310 15:50:25.968370 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:25 crc kubenswrapper[4749]: I0310 15:50:25.968410 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:25 crc kubenswrapper[4749]: I0310 15:50:25.968420 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:25Z","lastTransitionTime":"2026-03-10T15:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:25 crc kubenswrapper[4749]: E0310 15:50:25.989324 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:25Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:25 crc kubenswrapper[4749]: I0310 15:50:25.995231 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:25 crc kubenswrapper[4749]: I0310 15:50:25.995294 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:25 crc kubenswrapper[4749]: I0310 15:50:25.995308 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:25 crc kubenswrapper[4749]: I0310 15:50:25.995324 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:25 crc kubenswrapper[4749]: I0310 15:50:25.995335 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:25Z","lastTransitionTime":"2026-03-10T15:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:26 crc kubenswrapper[4749]: E0310 15:50:26.009403 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.015656 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.015718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.015732 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.015749 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.015762 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:26Z","lastTransitionTime":"2026-03-10T15:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:26 crc kubenswrapper[4749]: E0310 15:50:26.033658 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.038266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.038304 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.038314 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.038330 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.038340 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:26Z","lastTransitionTime":"2026-03-10T15:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:26 crc kubenswrapper[4749]: E0310 15:50:26.051493 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.055621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.055664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.055678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.055696 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.055706 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:26Z","lastTransitionTime":"2026-03-10T15:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:26 crc kubenswrapper[4749]: E0310 15:50:26.067293 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:26 crc kubenswrapper[4749]: E0310 15:50:26.067457 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.432267 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.606153 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.606324 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:26 crc kubenswrapper[4749]: E0310 15:50:26.606410 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.606187 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:26 crc kubenswrapper[4749]: E0310 15:50:26.606568 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:26 crc kubenswrapper[4749]: I0310 15:50:26.606434 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:26 crc kubenswrapper[4749]: E0310 15:50:26.606738 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:26 crc kubenswrapper[4749]: E0310 15:50:26.606808 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:28 crc kubenswrapper[4749]: I0310 15:50:28.606291 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:28 crc kubenswrapper[4749]: I0310 15:50:28.606334 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:28 crc kubenswrapper[4749]: I0310 15:50:28.606508 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:28 crc kubenswrapper[4749]: E0310 15:50:28.606624 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:28 crc kubenswrapper[4749]: E0310 15:50:28.606744 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:28 crc kubenswrapper[4749]: E0310 15:50:28.606918 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:28 crc kubenswrapper[4749]: I0310 15:50:28.606933 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:28 crc kubenswrapper[4749]: E0310 15:50:28.607081 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:28 crc kubenswrapper[4749]: E0310 15:50:28.686566 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:50:30 crc kubenswrapper[4749]: I0310 15:50:30.605709 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:30 crc kubenswrapper[4749]: I0310 15:50:30.605856 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:30 crc kubenswrapper[4749]: E0310 15:50:30.605926 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:30 crc kubenswrapper[4749]: I0310 15:50:30.605728 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:30 crc kubenswrapper[4749]: I0310 15:50:30.605729 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:30 crc kubenswrapper[4749]: E0310 15:50:30.606075 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:30 crc kubenswrapper[4749]: E0310 15:50:30.606336 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:30 crc kubenswrapper[4749]: E0310 15:50:30.606585 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:32 crc kubenswrapper[4749]: I0310 15:50:32.606570 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:32 crc kubenswrapper[4749]: I0310 15:50:32.606699 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:32 crc kubenswrapper[4749]: E0310 15:50:32.606782 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:32 crc kubenswrapper[4749]: I0310 15:50:32.606863 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:32 crc kubenswrapper[4749]: E0310 15:50:32.606923 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:32 crc kubenswrapper[4749]: I0310 15:50:32.606972 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:32 crc kubenswrapper[4749]: E0310 15:50:32.607132 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:32 crc kubenswrapper[4749]: E0310 15:50:32.607196 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.606901 4749 scope.go:117] "RemoveContainer" containerID="3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3" Mar 10 15:50:33 crc kubenswrapper[4749]: E0310 15:50:33.607067 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.624312 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.645850 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:19Z\\\",\\\"message\\\":\\\"72c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 15:50:19.424815 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 15:50:19.424884 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0
b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.661719 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.683565 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: E0310 15:50:33.687185 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.700698 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63
f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.714531 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.729885 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 
15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.746008 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.770657 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.786763 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.797983 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.813286 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.830002 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079ee3fb-8259-47be-bfda-ac3dd5cc5e2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2ce914fe09be8e266078e486a824c15113758204e331d204314552889a176cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6520e37fbb486b4fe9ee982e761c8153efff9e4f9a136c0492860dd440ebbb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f728cec110ab3738832921769ef26fb951632b05de4c3fb1f04fbccc152c8df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.844778 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.858074 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc 
kubenswrapper[4749]: I0310 15:50:33.872103 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc 
kubenswrapper[4749]: I0310 15:50:33.886431 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:33 crc kubenswrapper[4749]: I0310 15:50:33.898832 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:33Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:34 crc kubenswrapper[4749]: I0310 15:50:34.606104 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:34 crc kubenswrapper[4749]: I0310 15:50:34.606144 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:34 crc kubenswrapper[4749]: I0310 15:50:34.606166 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:34 crc kubenswrapper[4749]: I0310 15:50:34.606144 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:34 crc kubenswrapper[4749]: E0310 15:50:34.606252 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:34 crc kubenswrapper[4749]: E0310 15:50:34.606529 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:34 crc kubenswrapper[4749]: E0310 15:50:34.606655 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:34 crc kubenswrapper[4749]: E0310 15:50:34.606782 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.312139 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.312180 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.312191 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.312209 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.312223 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:36Z","lastTransitionTime":"2026-03-10T15:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:36 crc kubenswrapper[4749]: E0310 15:50:36.325803 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.329300 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.329359 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.329385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.329406 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.329419 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:36Z","lastTransitionTime":"2026-03-10T15:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:36 crc kubenswrapper[4749]: E0310 15:50:36.341003 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.344763 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.344809 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.344820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.344842 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.344855 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:36Z","lastTransitionTime":"2026-03-10T15:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:36 crc kubenswrapper[4749]: E0310 15:50:36.402046 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: E0310 15:50:36.402238 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.438873 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.449583 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.467369 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:19Z\\\",\\\"message\\\":\\\"72c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 15:50:19.424815 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 15:50:19.424884 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0
b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.477526 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.489497 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.504923 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc47827
4c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 
15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.518109 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.532344 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.545583 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.556890 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.569293 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.583868 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079ee3fb-8259-47be-bfda-ac3dd5cc5e2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2ce914fe09be8e266078e486a824c15113758204e331d204314552889a176cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6520e37fbb486b4fe9ee982e761c8153efff9e4f9a136c0492860dd440ebbb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f728cec110ab3738832921769ef26fb951632b05de4c3fb1f04fbccc152c8df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.601858 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.606571 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.606599 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.606581 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:36 crc kubenswrapper[4749]: E0310 15:50:36.606685 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.606826 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:36 crc kubenswrapper[4749]: E0310 15:50:36.607009 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:36 crc kubenswrapper[4749]: E0310 15:50:36.607173 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:36 crc kubenswrapper[4749]: E0310 15:50:36.607278 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.616503 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.631684 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.644533 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig 
--namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T1
5:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.656943 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.668599 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:36 crc kubenswrapper[4749]: I0310 15:50:36.683020 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:38 crc 
kubenswrapper[4749]: I0310 15:50:38.606468 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:38 crc kubenswrapper[4749]: I0310 15:50:38.606584 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:38 crc kubenswrapper[4749]: E0310 15:50:38.606712 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:38 crc kubenswrapper[4749]: I0310 15:50:38.606755 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:38 crc kubenswrapper[4749]: E0310 15:50:38.607049 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:38 crc kubenswrapper[4749]: I0310 15:50:38.607064 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:38 crc kubenswrapper[4749]: E0310 15:50:38.607157 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:38 crc kubenswrapper[4749]: E0310 15:50:38.607425 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:38 crc kubenswrapper[4749]: I0310 15:50:38.622210 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 10 15:50:38 crc kubenswrapper[4749]: E0310 15:50:38.689078 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.279255 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwpmf_807d12f5-c95a-4a7e-91c5-128de3d2235c/kube-multus/0.log" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.279317 4749 generic.go:334] "Generic (PLEG): container finished" podID="807d12f5-c95a-4a7e-91c5-128de3d2235c" containerID="78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518" exitCode=1 Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.279408 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwpmf" event={"ID":"807d12f5-c95a-4a7e-91c5-128de3d2235c","Type":"ContainerDied","Data":"78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518"} Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.280225 4749 scope.go:117] "RemoveContainer" containerID="78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.307411 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:19Z\\\",\\\"message\\\":\\\"72c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid 
== {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 15:50:19.424815 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 15:50:19.424884 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0
b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.324442 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.339552 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.355672 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.374678 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.387843 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.403585 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.418244 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e
37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.432057 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5cf962-372f-43ba-8783-a38009ffb8a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5201d42ee17cfddadb04e77decb575ee63afeab3f0f1dac0ea675763a694a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.452639 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.468243 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.486089 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.500466 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.514232 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079ee3fb-8259-47be-bfda-ac3dd5cc5e2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2ce914fe09be8e266078e486a824c15113758204e331d204314552889a176cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6520e37fbb486b4fe9ee982e761c8153efff9e4f9a136c0492860dd440ebbb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f728cec110ab3738832921769ef26fb951632b05de4c3fb1f04fbccc152c8df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.529755 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.545280 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.556812 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc 
kubenswrapper[4749]: I0310 15:50:40.571058 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:39Z\\\",\\\"message\\\":\\\"2026-03-10T15:49:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a368ca98-5f12-487a-8c75-5c73b4a42f4a\\\\n2026-03-10T15:49:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a368ca98-5f12-487a-8c75-5c73b4a42f4a to /host/opt/cni/bin/\\\\n2026-03-10T15:49:54Z [verbose] multus-daemon started\\\\n2026-03-10T15:49:54Z [verbose] Readiness Indicator file check\\\\n2026-03-10T15:50:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.586834 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620583a9defb0440
5be325806cdad98188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:40Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.606058 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.606107 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.606100 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:40 crc kubenswrapper[4749]: I0310 15:50:40.606084 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:40 crc kubenswrapper[4749]: E0310 15:50:40.606215 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:40 crc kubenswrapper[4749]: E0310 15:50:40.606328 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:40 crc kubenswrapper[4749]: E0310 15:50:40.606353 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:40 crc kubenswrapper[4749]: E0310 15:50:40.606430 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.284559 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwpmf_807d12f5-c95a-4a7e-91c5-128de3d2235c/kube-multus/0.log" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.284628 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwpmf" event={"ID":"807d12f5-c95a-4a7e-91c5-128de3d2235c","Type":"ContainerStarted","Data":"5018cd45279aff135b2d7eaa3883f7daf5bb7fbe0ceca1c731299f0aa32c35bd"} Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.298929 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.310773 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.323514 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079ee3fb-8259-47be-bfda-ac3dd5cc5e2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2ce914fe09be8e266078e486a824c15113758204e331d204314552889a176cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6520e37fbb486b4fe9ee982e761c8153efff9e4f9a136c0492860dd440ebbb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f728cec110ab3738832921769ef26fb951632b05de4c3fb1f04fbccc152c8df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.333918 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5cf962-372f-43ba-8783-a38009ffb8a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5201d42ee17cfddadb04e77decb575ee63afeab3f0f1dac0ea675763a694a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.356952 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.368139 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.379648 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5018cd45279aff135b2d7eaa3883f7daf5bb7fbe0ceca1c731299f0aa32c35bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:39Z\\\",\\\"message\\\":\\\"2026-03-10T15:49:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a368ca98-5f12-487a-8c75-5c73b4a42f4a\\\\n2026-03-10T15:49:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a368ca98-5f12-487a-8c75-5c73b4a42f4a to /host/opt/cni/bin/\\\\n2026-03-10T15:49:54Z [verbose] multus-daemon started\\\\n2026-03-10T15:49:54Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T15:50:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.391667 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.403361 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.414139 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.423719 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc 
kubenswrapper[4749]: I0310 15:50:41.431525 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.448563 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:19Z\\\",\\\"message\\\":\\\"72c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 15:50:19.424815 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 15:50:19.424884 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0
b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.457187 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.466057 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.480640 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc47827
4c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 
15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.492011 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.504249 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:41 crc kubenswrapper[4749]: I0310 15:50:41.519825 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:41Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:42 crc kubenswrapper[4749]: I0310 15:50:42.606776 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:42 crc kubenswrapper[4749]: I0310 15:50:42.606888 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:42 crc kubenswrapper[4749]: I0310 15:50:42.606776 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:42 crc kubenswrapper[4749]: E0310 15:50:42.606945 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:42 crc kubenswrapper[4749]: E0310 15:50:42.607153 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:42 crc kubenswrapper[4749]: E0310 15:50:42.607196 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:42 crc kubenswrapper[4749]: I0310 15:50:42.607436 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:42 crc kubenswrapper[4749]: E0310 15:50:42.607594 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.622274 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.643799 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:19Z\\\",\\\"message\\\":\\\"72c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports 
Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 15:50:19.424815 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 15:50:19.424884 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0
b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.658950 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.674130 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e
37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: E0310 15:50:43.690002 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.695229 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.712764 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.736177 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.754811 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.770853 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.790019 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079ee3fb-8259-47be-bfda-ac3dd5cc5e2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2ce914fe09be8e266078e486a824c15113758204e331d204314552889a176cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6520e37fbb486b4fe9ee982e761c8153efff9e4f9a136c0492860dd440ebbb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f728cec110ab3738832921769ef26fb951632b05de4c3fb1f04fbccc152c8df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.807496 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5cf962-372f-43ba-8783-a38009ffb8a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5201d42ee17cfddadb04e77decb575ee63afeab3f0f1dac0ea675763a694a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.830863 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.848176 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.863137 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.882477 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ 
'[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.903465 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.939639 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc kubenswrapper[4749]: I0310 15:50:43.955719 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:43 crc 
kubenswrapper[4749]: I0310 15:50:43.971766 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5018cd45279aff135b2d7eaa3883f7daf5bb7fbe0ceca1c731299f0aa32c35bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:39Z\\\",\\\"message\\\":\\\"2026-03-10T15:49:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a368ca98-5f12-487a-8c75-5c73b4a42f4a\\\\n2026-03-10T15:49:53+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_a368ca98-5f12-487a-8c75-5c73b4a42f4a to /host/opt/cni/bin/\\\\n2026-03-10T15:49:54Z [verbose] multus-daemon started\\\\n2026-03-10T15:49:54Z [verbose] Readiness Indicator file check\\\\n2026-03-10T15:50:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:43Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:44 crc kubenswrapper[4749]: I0310 15:50:44.606463 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:44 crc kubenswrapper[4749]: I0310 15:50:44.606580 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:44 crc kubenswrapper[4749]: I0310 15:50:44.606575 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:44 crc kubenswrapper[4749]: I0310 15:50:44.606514 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:44 crc kubenswrapper[4749]: E0310 15:50:44.606726 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:44 crc kubenswrapper[4749]: E0310 15:50:44.606881 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:44 crc kubenswrapper[4749]: E0310 15:50:44.606973 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:44 crc kubenswrapper[4749]: E0310 15:50:44.607505 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.453174 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.453228 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.453239 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.453255 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.453267 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:46Z","lastTransitionTime":"2026-03-10T15:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:46 crc kubenswrapper[4749]: E0310 15:50:46.467509 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.473118 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.473169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.473180 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.473199 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.473210 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:46Z","lastTransitionTime":"2026-03-10T15:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:46 crc kubenswrapper[4749]: E0310 15:50:46.488646 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.493963 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.494012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.494022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.494041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.494053 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:46Z","lastTransitionTime":"2026-03-10T15:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:46 crc kubenswrapper[4749]: E0310 15:50:46.508359 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.513907 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.513951 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.513963 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.513981 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.513992 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:46Z","lastTransitionTime":"2026-03-10T15:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:46 crc kubenswrapper[4749]: E0310 15:50:46.531273 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.535788 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.535846 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.535859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.535879 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.535892 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:46Z","lastTransitionTime":"2026-03-10T15:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:50:46 crc kubenswrapper[4749]: E0310 15:50:46.551983 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"baf5cb54-c273-4495-b7cb-c1fd4f825d5e\\\",\\\"systemUUID\\\":\\\"daf2981f-1789-4491-b9fa-78a944145505\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:46 crc kubenswrapper[4749]: E0310 15:50:46.552146 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.606405 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.606405 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.606435 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:46 crc kubenswrapper[4749]: I0310 15:50:46.606593 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:46 crc kubenswrapper[4749]: E0310 15:50:46.606741 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:46 crc kubenswrapper[4749]: E0310 15:50:46.606941 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:46 crc kubenswrapper[4749]: E0310 15:50:46.607193 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:46 crc kubenswrapper[4749]: E0310 15:50:46.607450 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:47 crc kubenswrapper[4749]: I0310 15:50:47.607190 4749 scope.go:117] "RemoveContainer" containerID="3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.311344 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/2.log" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.314268 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerStarted","Data":"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4"} Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.315009 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.332362 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.348943 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.363472 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc 
kubenswrapper[4749]: I0310 15:50:48.380522 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5018cd45279aff135b2d7eaa3883f7daf5bb7fbe0ceca1c731299f0aa32c35bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:39Z\\\",\\\"message\\\":\\\"2026-03-10T15:49:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a368ca98-5f12-487a-8c75-5c73b4a42f4a\\\\n2026-03-10T15:49:53+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_a368ca98-5f12-487a-8c75-5c73b4a42f4a to /host/opt/cni/bin/\\\\n2026-03-10T15:49:54Z [verbose] multus-daemon started\\\\n2026-03-10T15:49:54Z [verbose] Readiness Indicator file check\\\\n2026-03-10T15:50:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.396087 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.415308 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:19Z\\\",\\\"message\\\":\\\"72c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 15:50:19.424815 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 15:50:19.424884 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.429511 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.446104 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.459032 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.475220 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.486252 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.499701 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.517475 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e
37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.532713 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5cf962-372f-43ba-8783-a38009ffb8a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5201d42ee17cfddadb04e77decb575ee63afeab3f0f1dac0ea675763a694a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.558785 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.574966 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.589097 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.603116 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.606199 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.606229 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.606243 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:48 crc kubenswrapper[4749]: E0310 15:50:48.606326 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:48 crc kubenswrapper[4749]: E0310 15:50:48.606402 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:48 crc kubenswrapper[4749]: E0310 15:50:48.606481 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.606597 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:48 crc kubenswrapper[4749]: E0310 15:50:48.606752 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:48 crc kubenswrapper[4749]: I0310 15:50:48.617598 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"079ee3fb-8259-47be-bfda-ac3dd5cc5e2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2ce914fe09be8e266078e486a824c15113758204e331d204314552889a176cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\"
:\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6520e37fbb486b4fe9ee982e761c8153efff9e4f9a136c0492860dd440ebbb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f728cec110ab3738832921769ef26fb951632b05de4c3fb1f04fbccc152c8df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:48 crc kubenswrapper[4749]: E0310 15:50:48.691345 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.320905 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/3.log" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.321712 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/2.log" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.324815 4749 generic.go:334] "Generic (PLEG): container finished" podID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerID="223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4" exitCode=1 Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.324899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerDied","Data":"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4"} Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.325002 4749 scope.go:117] "RemoveContainer" containerID="3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.325598 4749 scope.go:117] "RemoveContainer" containerID="223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4" Mar 10 15:50:49 crc kubenswrapper[4749]: E0310 15:50:49.325843 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.341488 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc 
kubenswrapper[4749]: I0310 15:50:49.361339 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5018cd45279aff135b2d7eaa3883f7daf5bb7fbe0ceca1c731299f0aa32c35bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:39Z\\\",\\\"message\\\":\\\"2026-03-10T15:49:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a368ca98-5f12-487a-8c75-5c73b4a42f4a\\\\n2026-03-10T15:49:53+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_a368ca98-5f12-487a-8c75-5c73b4a42f4a to /host/opt/cni/bin/\\\\n2026-03-10T15:49:54Z [verbose] multus-daemon started\\\\n2026-03-10T15:49:54Z [verbose] Readiness Indicator file check\\\\n2026-03-10T15:50:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.375480 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.390211 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.406794 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.419424 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.441770 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3da09a5cfa3dc5ad8df3d353346597455bc45697086ba775eea109abb5fbc4e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:19Z\\\",\\\"message\\\":\\\"72c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 15:50:19.424815 6922 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 15:50:19.424884 6922 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:48Z\\\",\\\"message\\\":\\\"r\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.168\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 15:50:48.432722 7280 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nI0310 15:50:48.432729 7280 services_controller.go:356] Processing sync for service openshift-cluster-version/cluster-version-operator for network=default\\\\nI0310 15:50:48.432736 7280 services_controller.go:360] Finished syncing service console on namespace openshift-console for network=default : 3.065426ms\\\\nI0310 15:50:48.432750 7280 services_controller.go:356] Processing sync for service openshift-kube-scheduler-operator/metrics for network=default\\\\nF0310 15:50:48.432615 7280 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25
db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.459635 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.474275 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.488705 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.506257 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e
37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.525757 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.541074 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.557621 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.575742 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.593695 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.611547 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079ee3fb-8259-47be-bfda-ac3dd5cc5e2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2ce914fe09be8e266078e486a824c15113758204e331d204314552889a176cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6520e37fbb486b4fe9ee982e761c8153efff9e4f9a136c0492860dd440ebbb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f728cec110ab3738832921769ef26fb951632b05de4c3fb1f04fbccc152c8df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.624834 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5cf962-372f-43ba-8783-a38009ffb8a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5201d42ee17cfddadb04e77decb575ee63afeab3f0f1dac0ea675763a694a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:49 crc kubenswrapper[4749]: I0310 15:50:49.648197 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.330222 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/3.log" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.334840 4749 scope.go:117] "RemoveContainer" containerID="223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4" Mar 10 15:50:50 crc kubenswrapper[4749]: E0310 15:50:50.335038 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.349173 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.364713 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.377974 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.390085 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc 
kubenswrapper[4749]: I0310 15:50:50.405658 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5018cd45279aff135b2d7eaa3883f7daf5bb7fbe0ceca1c731299f0aa32c35bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:39Z\\\",\\\"message\\\":\\\"2026-03-10T15:49:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a368ca98-5f12-487a-8c75-5c73b4a42f4a\\\\n2026-03-10T15:49:53+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_a368ca98-5f12-487a-8c75-5c73b4a42f4a to /host/opt/cni/bin/\\\\n2026-03-10T15:49:54Z [verbose] multus-daemon started\\\\n2026-03-10T15:49:54Z [verbose] Readiness Indicator file check\\\\n2026-03-10T15:50:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.417934 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.436167 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:48Z\\\",\\\"message\\\":\\\"r\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, 
SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.168\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 15:50:48.432722 7280 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nI0310 15:50:48.432729 7280 services_controller.go:356] Processing sync for service openshift-cluster-version/cluster-version-operator for network=default\\\\nI0310 15:50:48.432736 7280 services_controller.go:360] Finished syncing service console on namespace openshift-console for network=default : 3.065426ms\\\\nI0310 15:50:48.432750 7280 services_controller.go:356] Processing sync for service openshift-kube-scheduler-operator/metrics for network=default\\\\nF0310 15:50:48.432615 7280 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0
b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.449734 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e
37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.462682 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.478105 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40e
e203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.496549 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff3
0d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.507989 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.519971 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.533072 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079ee3fb-8259-47be-bfda-ac3dd5cc5e2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2ce914fe09be8e266078e486a824c15113758204e331d204314552889a176cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6520e37fbb486b4fe9ee982e761c8153efff9e4f9a136c0492860dd440ebbb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f728cec110ab3738832921769ef26fb951632b05de4c3fb1f04fbccc152c8df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.544993 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5cf962-372f-43ba-8783-a38009ffb8a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5201d42ee17cfddadb04e77decb575ee63afeab3f0f1dac0ea675763a694a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.568705 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.586416 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.606705 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.606705 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.606868 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:50 crc kubenswrapper[4749]: E0310 15:50:50.606899 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.606732 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:50 crc kubenswrapper[4749]: E0310 15:50:50.607022 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:50 crc kubenswrapper[4749]: E0310 15:50:50.607172 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:50 crc kubenswrapper[4749]: E0310 15:50:50.607284 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.611424 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:50 crc kubenswrapper[4749]: I0310 15:50:50.626343 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:52 crc kubenswrapper[4749]: I0310 15:50:52.606020 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:52 crc kubenswrapper[4749]: I0310 15:50:52.606101 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:52 crc kubenswrapper[4749]: E0310 15:50:52.607225 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:52 crc kubenswrapper[4749]: I0310 15:50:52.606117 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:52 crc kubenswrapper[4749]: I0310 15:50:52.606155 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:52 crc kubenswrapper[4749]: E0310 15:50:52.607338 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:52 crc kubenswrapper[4749]: E0310 15:50:52.607419 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:52 crc kubenswrapper[4749]: E0310 15:50:52.607242 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.617351 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07b78914-24ae-4dc3-a640-23ade3cb9d39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b564b43721785f1063eb21a453ceef9f141d7ee0a9b94a2ee62085eb413ee8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2a49548463e6eee4580cc3b40482068a40ee203c4e5e585d7d92141bd1c8d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x88x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pp7d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.633089 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-tp7tp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0636061-098d-4b79-b24d-ae0e070c8b17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f79adf5e82468dfd6b91f8fcd08c9909bc06fa7181763514430cf2f025ad1cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bd1b75d487b0b8177036919f4b519f266c8b49d787a309f3b7ac77ede7f503\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51624bb4cc89af1be00f5943b661c02e33c9ea44f3d71f61e67977991d9fa49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:53Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1037c5e8f3ac0d50849c5199ca0460af0f179b508d56d946dec2a1eba741748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21ff30d1bc5a019b0ce703c5f4e3a0753badf0724073492a044a15e8ff79952c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1bd62bedaf9138ab78e14d71a5e1d843c49a8ee87439dab0d888cb82046edf\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2abf88832f67fe51958e84570d29de101456a6b7f47857fea22771b5330af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:50:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zktgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-tp7tp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.645392 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j4tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f483ba5-0e39-43a8-b651-9db5308235d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb0fec807f99d3fa5b71f96edff9b0584ff15a0ab9763d0f374894350248c44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nwnzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j4tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.658714 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebcbc0fc-15f3-4e4e-ae14-832adec8da50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://156d3760300c6525d06a10a38a6552fb013205a32f263529fc56b5f2d834ebb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1
d4f02019fac92f84f23583e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qk2c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7rts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.675667 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c4083e1-ef1a-4d75-9c21-20e180f6a5e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:49:36Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 15:49:36.134876 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 15:49:36.135062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:49:36.135947 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1694565957/tls.crt::/tmp/serving-cert-1694565957/tls.key\\\\\\\"\\\\nI0310 15:49:36.743509 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 15:49:36.746001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 15:49:36.746031 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 15:49:36.746059 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 15:49:36.746070 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 15:49:36.750822 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 15:49:36.750929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750964 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 15:49:36.750991 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0310 15:49:36.750842 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 15:49:36.751018 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 15:49:36.751151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 15:49:36.751187 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 15:49:36.753015 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a55fe9deeb6cf3405c40d8ef6c234a08e
37c8ecab5713525fdfd8d1653a777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: E0310 15:50:53.692054 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.693551 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.716739 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e81b87d4-287a-4a46-82e5-a8e5411816b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8211657f6ea2b7d22444118f42f5c09cc53de5a5a2f43f8ac670492ff975f358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6ee97d51ccc5b79e6f62e6d215538d3595fa52bc03c4af716cadef7e51a5ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a7b48171a62e7cc19df2d8ec579a66a76013eb5a772c35a6c539a53655baea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bc59d3a9ad5bc88c6fce3cae9f66d65ce2ed3197ee7b12f30afb51d5e56d5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d94e596b92dda62c034a54ed315477169d3ea83fd0d3c3435cad9d762debff7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c46b62ddc92ca160a5f5375cf492edb575f719b8e6723b7f89152d8a93e8d76\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9508c2bd27fb33e9be68413dbb379c0350d76d9ae692018090ef1d09555297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d89644afcb6b5955ff86663fbf3ee968de8e300f110f96d545c41b3ce85788b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.733699 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2924576ded6519950b33714eaf7b144cea3795b714aa9fa46cb0532c3e8c6834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.747830 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1d74b29efb0ec0b31e07bd573117566857b623b8e014be52d1b0eff6373f9d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.762816 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.775327 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"079ee3fb-8259-47be-bfda-ac3dd5cc5e2d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2ce914fe09be8e266078e486a824c15113758204e331d204314552889a176cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6520e37fbb486b4fe9ee982e761c8153efff9e4f9a136c0492860dd440ebbb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f728cec110ab3738832921769ef26fb951632b05de4c3fb1f04fbccc152c8df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://84dcfcc929e56275a77a9898decc2981fc9856375200ea71c39cfd54b96005c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.790529 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d5cf962-372f-43ba-8783-a38009ffb8a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5201d42ee17cfddadb04e77decb575ee63afeab3f0f1dac0ea675763a694a71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c5df3774620903b3549b5a92b729968835f7f1373703f1847c7925c0c37cf5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:48:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.807195 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ce2c00c80610c82c8f9f495893e41565229d051b673828f2d65963d6b923082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac13d9b8e5fa63ac582df8c477dc7bc3fa670613b3ab8ec1017a43caec016e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.822765 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd3985af-f2c3-4f91-919e-2ea9420418b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nz5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpmqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc 
kubenswrapper[4749]: I0310 15:50:53.842179 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gwpmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"807d12f5-c95a-4a7e-91c5-128de3d2235c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:50:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5018cd45279aff135b2d7eaa3883f7daf5bb7fbe0ceca1c731299f0aa32c35bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:39Z\\\",\\\"message\\\":\\\"2026-03-10T15:49:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a368ca98-5f12-487a-8c75-5c73b4a42f4a\\\\n2026-03-10T15:49:53+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_a368ca98-5f12-487a-8c75-5c73b4a42f4a to /host/opt/cni/bin/\\\\n2026-03-10T15:49:54Z [verbose] multus-daemon started\\\\n2026-03-10T15:49:54Z [verbose] Readiness Indicator file check\\\\n2026-03-10T15:50:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:50:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sj9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gwpmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.860024 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c229aa9-e4f2-4aa7-a6a1-c89dfc7ff4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8157d9132bdf76a4da721942933b9e00310617d1140703056d6872f621ca7dde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc5a02e1f193cc73e422620583a9defb04405be325806cdad98188d53f79e84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:48:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:48:25.594214 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:48:25.596690 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:48:25.623893 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:48:25.628524 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0310 15:48:55.785323 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65e48472c2ad166378341e74919dd236b987d3053355b5555f682b36549a8016\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42574e7497ac462461fca8e00c7b1de706f4639c6acb0aa68595342e2fc96ca5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:48:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:48:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.875869 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.890284 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r8l57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eadca31d-151b-4569-8c6f-71ce4a6f0d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f277a8aa75d447a51d00b420401802d74817187ab1daf06f6c882eef34c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bq7ml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r8l57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:53 crc kubenswrapper[4749]: I0310 15:50:53.921729 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:49:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:50:48Z\\\",\\\"message\\\":\\\"r\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, 
SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.168\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 15:50:48.432722 7280 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nI0310 15:50:48.432729 7280 services_controller.go:356] Processing sync for service openshift-cluster-version/cluster-version-operator for network=default\\\\nI0310 15:50:48.432736 7280 services_controller.go:360] Finished syncing service console on namespace openshift-console for network=default : 3.065426ms\\\\nI0310 15:50:48.432750 7280 services_controller.go:356] Processing sync for service openshift-kube-scheduler-operator/metrics for network=default\\\\nF0310 15:50:48.432615 7280 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:50:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:49:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc80250bf1576cfc0
b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:49:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:49:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8gk7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:49:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nvpsq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:50:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:50:54 crc kubenswrapper[4749]: I0310 15:50:54.541991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.542132 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.542106943 +0000 UTC m=+215.663972640 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:50:54 crc kubenswrapper[4749]: I0310 15:50:54.542184 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:54 crc kubenswrapper[4749]: I0310 15:50:54.542231 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.542360 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.542476 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.542458644 +0000 UTC m=+215.664324351 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.542737 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.542817 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.542799794 +0000 UTC m=+215.664665491 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:50:54 crc kubenswrapper[4749]: I0310 15:50:54.606258 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:54 crc kubenswrapper[4749]: I0310 15:50:54.606286 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:54 crc kubenswrapper[4749]: I0310 15:50:54.606252 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:54 crc kubenswrapper[4749]: I0310 15:50:54.606421 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.606542 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.606747 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.606759 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.606800 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:54 crc kubenswrapper[4749]: I0310 15:50:54.643954 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.644235 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.644669 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.644688 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.644775 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.644741464 +0000 UTC m=+215.766607151 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:50:54 crc kubenswrapper[4749]: I0310 15:50:54.645057 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.645164 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.645260 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.645268 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:50:54 crc kubenswrapper[4749]: 
I0310 15:50:54.645420 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs\") pod \"network-metrics-daemon-jpmqp\" (UID: \"cd3985af-f2c3-4f91-919e-2ea9420418b3\") " pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.645533 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.645565 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs podName:cd3985af-f2c3-4f91-919e-2ea9420418b3 nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.64555568 +0000 UTC m=+215.767421367 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs") pod "network-metrics-daemon-jpmqp" (UID: "cd3985af-f2c3-4f91-919e-2ea9420418b3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:50:54 crc kubenswrapper[4749]: E0310 15:50:54.645679 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.645618762 +0000 UTC m=+215.767484509 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.606488 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.606519 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.606519 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.606689 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:56 crc kubenswrapper[4749]: E0310 15:50:56.606934 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:56 crc kubenswrapper[4749]: E0310 15:50:56.607261 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:56 crc kubenswrapper[4749]: E0310 15:50:56.607324 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:56 crc kubenswrapper[4749]: E0310 15:50:56.607432 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.688345 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.688445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.688499 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.688535 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.688568 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:50:56Z","lastTransitionTime":"2026-03-10T15:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.748369 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq"] Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.748947 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.751678 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.751743 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.751954 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.752287 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.766022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efe15e38-057e-46c2-b6a5-8da8d977857a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.766083 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efe15e38-057e-46c2-b6a5-8da8d977857a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.766133 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/efe15e38-057e-46c2-b6a5-8da8d977857a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.766188 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/efe15e38-057e-46c2-b6a5-8da8d977857a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.766225 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efe15e38-057e-46c2-b6a5-8da8d977857a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.785037 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=63.785008851 podStartE2EDuration="1m3.785008851s" podCreationTimestamp="2026-03-10 15:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:50:56.770561782 +0000 UTC m=+153.892427469" watchObservedRunningTime="2026-03-10 15:50:56.785008851 +0000 UTC m=+153.906874538" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.822946 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pp7d7" podStartSLOduration=115.82291942 
podStartE2EDuration="1m55.82291942s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:50:56.799542473 +0000 UTC m=+153.921408170" watchObservedRunningTime="2026-03-10 15:50:56.82291942 +0000 UTC m=+153.944785107" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.849751 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-j4tr6" podStartSLOduration=115.849726513 podStartE2EDuration="1m55.849726513s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:50:56.849636591 +0000 UTC m=+153.971502298" watchObservedRunningTime="2026-03-10 15:50:56.849726513 +0000 UTC m=+153.971592200" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.850031 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tp7tp" podStartSLOduration=115.850025292 podStartE2EDuration="1m55.850025292s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:50:56.83449263 +0000 UTC m=+153.956358327" watchObservedRunningTime="2026-03-10 15:50:56.850025292 +0000 UTC m=+153.971890979" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.864030 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podStartSLOduration=115.864004827 podStartE2EDuration="1m55.864004827s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:50:56.863881523 
+0000 UTC m=+153.985747220" watchObservedRunningTime="2026-03-10 15:50:56.864004827 +0000 UTC m=+153.985870514" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.868640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/efe15e38-057e-46c2-b6a5-8da8d977857a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.868721 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efe15e38-057e-46c2-b6a5-8da8d977857a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.868770 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efe15e38-057e-46c2-b6a5-8da8d977857a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.868791 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/efe15e38-057e-46c2-b6a5-8da8d977857a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.868813 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/efe15e38-057e-46c2-b6a5-8da8d977857a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.868844 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/efe15e38-057e-46c2-b6a5-8da8d977857a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.868991 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/efe15e38-057e-46c2-b6a5-8da8d977857a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.869984 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efe15e38-057e-46c2-b6a5-8da8d977857a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.881428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efe15e38-057e-46c2-b6a5-8da8d977857a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc 
kubenswrapper[4749]: I0310 15:50:56.888330 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efe15e38-057e-46c2-b6a5-8da8d977857a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7hjwq\" (UID: \"efe15e38-057e-46c2-b6a5-8da8d977857a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.893951 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=35.893926908 podStartE2EDuration="35.893926908s" podCreationTimestamp="2026-03-10 15:50:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:50:56.881538473 +0000 UTC m=+154.003404160" watchObservedRunningTime="2026-03-10 15:50:56.893926908 +0000 UTC m=+154.015792595" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.925782 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=18.925758077 podStartE2EDuration="18.925758077s" podCreationTimestamp="2026-03-10 15:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:50:56.894408153 +0000 UTC m=+154.016273840" watchObservedRunningTime="2026-03-10 15:50:56.925758077 +0000 UTC m=+154.047623774" Mar 10 15:50:56 crc kubenswrapper[4749]: I0310 15:50:56.945241 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=44.945212173 podStartE2EDuration="44.945212173s" podCreationTimestamp="2026-03-10 15:50:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 15:50:56.925581752 +0000 UTC m=+154.047447439" watchObservedRunningTime="2026-03-10 15:50:56.945212173 +0000 UTC m=+154.067077860" Mar 10 15:50:57 crc kubenswrapper[4749]: I0310 15:50:57.018016 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=43.017992636 podStartE2EDuration="43.017992636s" podCreationTimestamp="2026-03-10 15:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:50:56.999352217 +0000 UTC m=+154.121217904" watchObservedRunningTime="2026-03-10 15:50:57.017992636 +0000 UTC m=+154.139858323" Mar 10 15:50:57 crc kubenswrapper[4749]: I0310 15:50:57.063881 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" Mar 10 15:50:57 crc kubenswrapper[4749]: I0310 15:50:57.088414 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gwpmf" podStartSLOduration=116.088388376 podStartE2EDuration="1m56.088388376s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:50:57.067155675 +0000 UTC m=+154.189021382" watchObservedRunningTime="2026-03-10 15:50:57.088388376 +0000 UTC m=+154.210254063" Mar 10 15:50:57 crc kubenswrapper[4749]: I0310 15:50:57.105471 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-r8l57" podStartSLOduration=116.105452216 podStartE2EDuration="1m56.105452216s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:50:57.104547718 +0000 UTC 
m=+154.226413405" watchObservedRunningTime="2026-03-10 15:50:57.105452216 +0000 UTC m=+154.227317903" Mar 10 15:50:57 crc kubenswrapper[4749]: I0310 15:50:57.359748 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" event={"ID":"efe15e38-057e-46c2-b6a5-8da8d977857a","Type":"ContainerStarted","Data":"8d483e7f5f694ec28f47150c880ad53613500da210f0de452749657a840f59f3"} Mar 10 15:50:57 crc kubenswrapper[4749]: I0310 15:50:57.359808 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" event={"ID":"efe15e38-057e-46c2-b6a5-8da8d977857a","Type":"ContainerStarted","Data":"82b608f840fa4b0f6a23195eba458e05cae76763dcb6335c758223b759061c05"} Mar 10 15:50:57 crc kubenswrapper[4749]: I0310 15:50:57.383987 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7hjwq" podStartSLOduration=116.383955947 podStartE2EDuration="1m56.383955947s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:50:57.381645275 +0000 UTC m=+154.503510972" watchObservedRunningTime="2026-03-10 15:50:57.383955947 +0000 UTC m=+154.505821664" Mar 10 15:50:57 crc kubenswrapper[4749]: I0310 15:50:57.646705 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 10 15:50:57 crc kubenswrapper[4749]: I0310 15:50:57.658249 4749 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 15:50:58 crc kubenswrapper[4749]: I0310 15:50:58.605900 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:50:58 crc kubenswrapper[4749]: E0310 15:50:58.606047 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:50:58 crc kubenswrapper[4749]: I0310 15:50:58.606105 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:50:58 crc kubenswrapper[4749]: I0310 15:50:58.606226 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:50:58 crc kubenswrapper[4749]: E0310 15:50:58.606258 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:50:58 crc kubenswrapper[4749]: E0310 15:50:58.606439 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:50:58 crc kubenswrapper[4749]: I0310 15:50:58.606437 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:50:58 crc kubenswrapper[4749]: E0310 15:50:58.606543 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:50:58 crc kubenswrapper[4749]: E0310 15:50:58.693084 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:51:00 crc kubenswrapper[4749]: I0310 15:51:00.606494 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:00 crc kubenswrapper[4749]: I0310 15:51:00.606531 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:00 crc kubenswrapper[4749]: I0310 15:51:00.606563 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:00 crc kubenswrapper[4749]: I0310 15:51:00.606563 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:00 crc kubenswrapper[4749]: E0310 15:51:00.606657 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:00 crc kubenswrapper[4749]: E0310 15:51:00.606728 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:00 crc kubenswrapper[4749]: E0310 15:51:00.606832 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:00 crc kubenswrapper[4749]: E0310 15:51:00.606894 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:02 crc kubenswrapper[4749]: I0310 15:51:02.606111 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:02 crc kubenswrapper[4749]: I0310 15:51:02.606159 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:02 crc kubenswrapper[4749]: I0310 15:51:02.606217 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:02 crc kubenswrapper[4749]: I0310 15:51:02.606259 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:02 crc kubenswrapper[4749]: E0310 15:51:02.606418 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:02 crc kubenswrapper[4749]: E0310 15:51:02.606776 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:02 crc kubenswrapper[4749]: E0310 15:51:02.606897 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:02 crc kubenswrapper[4749]: E0310 15:51:02.606951 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:03 crc kubenswrapper[4749]: E0310 15:51:03.693505 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:51:04 crc kubenswrapper[4749]: I0310 15:51:04.606214 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:04 crc kubenswrapper[4749]: I0310 15:51:04.606687 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:04 crc kubenswrapper[4749]: I0310 15:51:04.606844 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:04 crc kubenswrapper[4749]: I0310 15:51:04.606918 4749 scope.go:117] "RemoveContainer" containerID="223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4" Mar 10 15:51:04 crc kubenswrapper[4749]: E0310 15:51:04.606991 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:04 crc kubenswrapper[4749]: E0310 15:51:04.607061 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" Mar 10 15:51:04 crc kubenswrapper[4749]: I0310 15:51:04.607187 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:04 crc kubenswrapper[4749]: E0310 15:51:04.607296 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:04 crc kubenswrapper[4749]: E0310 15:51:04.607474 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:04 crc kubenswrapper[4749]: E0310 15:51:04.607599 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:06 crc kubenswrapper[4749]: I0310 15:51:06.606474 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:06 crc kubenswrapper[4749]: I0310 15:51:06.606553 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:06 crc kubenswrapper[4749]: I0310 15:51:06.606559 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:06 crc kubenswrapper[4749]: I0310 15:51:06.606631 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:06 crc kubenswrapper[4749]: E0310 15:51:06.606642 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:06 crc kubenswrapper[4749]: E0310 15:51:06.606782 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:06 crc kubenswrapper[4749]: E0310 15:51:06.606857 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:06 crc kubenswrapper[4749]: E0310 15:51:06.606895 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:08 crc kubenswrapper[4749]: I0310 15:51:08.606586 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:08 crc kubenswrapper[4749]: I0310 15:51:08.606586 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:08 crc kubenswrapper[4749]: E0310 15:51:08.607296 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:08 crc kubenswrapper[4749]: I0310 15:51:08.606861 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:08 crc kubenswrapper[4749]: E0310 15:51:08.607460 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:08 crc kubenswrapper[4749]: I0310 15:51:08.606778 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:08 crc kubenswrapper[4749]: E0310 15:51:08.607081 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:08 crc kubenswrapper[4749]: E0310 15:51:08.607556 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:08 crc kubenswrapper[4749]: E0310 15:51:08.695075 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:51:10 crc kubenswrapper[4749]: I0310 15:51:10.606324 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:10 crc kubenswrapper[4749]: I0310 15:51:10.606324 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:10 crc kubenswrapper[4749]: E0310 15:51:10.606521 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:10 crc kubenswrapper[4749]: I0310 15:51:10.606367 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:10 crc kubenswrapper[4749]: E0310 15:51:10.606595 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:10 crc kubenswrapper[4749]: I0310 15:51:10.606356 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:10 crc kubenswrapper[4749]: E0310 15:51:10.606678 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:10 crc kubenswrapper[4749]: E0310 15:51:10.606842 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:12 crc kubenswrapper[4749]: I0310 15:51:12.606693 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:12 crc kubenswrapper[4749]: I0310 15:51:12.606771 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:12 crc kubenswrapper[4749]: E0310 15:51:12.606868 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:12 crc kubenswrapper[4749]: E0310 15:51:12.606953 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:12 crc kubenswrapper[4749]: I0310 15:51:12.607072 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:12 crc kubenswrapper[4749]: I0310 15:51:12.607149 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:12 crc kubenswrapper[4749]: E0310 15:51:12.607325 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:12 crc kubenswrapper[4749]: E0310 15:51:12.607509 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:13 crc kubenswrapper[4749]: E0310 15:51:13.695602 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:51:14 crc kubenswrapper[4749]: I0310 15:51:14.606552 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:14 crc kubenswrapper[4749]: E0310 15:51:14.607119 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:14 crc kubenswrapper[4749]: I0310 15:51:14.606702 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:14 crc kubenswrapper[4749]: E0310 15:51:14.607431 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:14 crc kubenswrapper[4749]: I0310 15:51:14.606558 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:14 crc kubenswrapper[4749]: E0310 15:51:14.607669 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:14 crc kubenswrapper[4749]: I0310 15:51:14.606718 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:14 crc kubenswrapper[4749]: E0310 15:51:14.607858 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:16 crc kubenswrapper[4749]: I0310 15:51:16.606506 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:16 crc kubenswrapper[4749]: I0310 15:51:16.606555 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:16 crc kubenswrapper[4749]: E0310 15:51:16.606713 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:16 crc kubenswrapper[4749]: I0310 15:51:16.606817 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:16 crc kubenswrapper[4749]: E0310 15:51:16.606880 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:16 crc kubenswrapper[4749]: E0310 15:51:16.607024 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:16 crc kubenswrapper[4749]: I0310 15:51:16.607709 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:16 crc kubenswrapper[4749]: E0310 15:51:16.607819 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:16 crc kubenswrapper[4749]: I0310 15:51:16.608078 4749 scope.go:117] "RemoveContainer" containerID="223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4" Mar 10 15:51:16 crc kubenswrapper[4749]: E0310 15:51:16.608267 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nvpsq_openshift-ovn-kubernetes(fac9a20c-b1f6-4bb2-a363-072abb3c04d2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" Mar 10 15:51:18 crc kubenswrapper[4749]: I0310 15:51:18.606590 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:18 crc kubenswrapper[4749]: I0310 15:51:18.606643 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:18 crc kubenswrapper[4749]: E0310 15:51:18.606747 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:18 crc kubenswrapper[4749]: I0310 15:51:18.606763 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:18 crc kubenswrapper[4749]: I0310 15:51:18.606856 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:18 crc kubenswrapper[4749]: E0310 15:51:18.607003 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:18 crc kubenswrapper[4749]: E0310 15:51:18.607132 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:18 crc kubenswrapper[4749]: E0310 15:51:18.607250 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:18 crc kubenswrapper[4749]: E0310 15:51:18.698142 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:51:20 crc kubenswrapper[4749]: I0310 15:51:20.606489 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:20 crc kubenswrapper[4749]: I0310 15:51:20.606542 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:20 crc kubenswrapper[4749]: I0310 15:51:20.606732 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:20 crc kubenswrapper[4749]: I0310 15:51:20.606801 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:20 crc kubenswrapper[4749]: E0310 15:51:20.606854 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:20 crc kubenswrapper[4749]: E0310 15:51:20.606919 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:20 crc kubenswrapper[4749]: E0310 15:51:20.607076 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:20 crc kubenswrapper[4749]: E0310 15:51:20.607191 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:22 crc kubenswrapper[4749]: I0310 15:51:22.606159 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:22 crc kubenswrapper[4749]: I0310 15:51:22.606248 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:22 crc kubenswrapper[4749]: E0310 15:51:22.606316 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:22 crc kubenswrapper[4749]: I0310 15:51:22.606336 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:22 crc kubenswrapper[4749]: I0310 15:51:22.606546 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:22 crc kubenswrapper[4749]: E0310 15:51:22.606648 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:22 crc kubenswrapper[4749]: E0310 15:51:22.606865 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:22 crc kubenswrapper[4749]: E0310 15:51:22.606983 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:23 crc kubenswrapper[4749]: E0310 15:51:23.698802 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:51:24 crc kubenswrapper[4749]: I0310 15:51:24.606419 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:24 crc kubenswrapper[4749]: I0310 15:51:24.606419 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:24 crc kubenswrapper[4749]: I0310 15:51:24.606827 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:24 crc kubenswrapper[4749]: E0310 15:51:24.607061 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:24 crc kubenswrapper[4749]: I0310 15:51:24.607177 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:24 crc kubenswrapper[4749]: E0310 15:51:24.607444 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:24 crc kubenswrapper[4749]: E0310 15:51:24.607511 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:24 crc kubenswrapper[4749]: E0310 15:51:24.607742 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:26 crc kubenswrapper[4749]: I0310 15:51:26.454140 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwpmf_807d12f5-c95a-4a7e-91c5-128de3d2235c/kube-multus/1.log" Mar 10 15:51:26 crc kubenswrapper[4749]: I0310 15:51:26.455413 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwpmf_807d12f5-c95a-4a7e-91c5-128de3d2235c/kube-multus/0.log" Mar 10 15:51:26 crc kubenswrapper[4749]: I0310 15:51:26.455472 4749 generic.go:334] "Generic (PLEG): container finished" podID="807d12f5-c95a-4a7e-91c5-128de3d2235c" containerID="5018cd45279aff135b2d7eaa3883f7daf5bb7fbe0ceca1c731299f0aa32c35bd" exitCode=1 Mar 10 15:51:26 crc kubenswrapper[4749]: I0310 15:51:26.455511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwpmf" event={"ID":"807d12f5-c95a-4a7e-91c5-128de3d2235c","Type":"ContainerDied","Data":"5018cd45279aff135b2d7eaa3883f7daf5bb7fbe0ceca1c731299f0aa32c35bd"} Mar 10 15:51:26 crc kubenswrapper[4749]: I0310 15:51:26.455571 4749 scope.go:117] "RemoveContainer" containerID="78400cf8402ad43c62489483c1f4d9b4362fe0d58689057f180b375122361518" Mar 10 15:51:26 crc kubenswrapper[4749]: I0310 15:51:26.456205 4749 scope.go:117] "RemoveContainer" containerID="5018cd45279aff135b2d7eaa3883f7daf5bb7fbe0ceca1c731299f0aa32c35bd" Mar 10 15:51:26 crc 
kubenswrapper[4749]: E0310 15:51:26.456435 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-gwpmf_openshift-multus(807d12f5-c95a-4a7e-91c5-128de3d2235c)\"" pod="openshift-multus/multus-gwpmf" podUID="807d12f5-c95a-4a7e-91c5-128de3d2235c" Mar 10 15:51:26 crc kubenswrapper[4749]: I0310 15:51:26.605601 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:26 crc kubenswrapper[4749]: I0310 15:51:26.605612 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:26 crc kubenswrapper[4749]: E0310 15:51:26.606045 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:26 crc kubenswrapper[4749]: I0310 15:51:26.605648 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:26 crc kubenswrapper[4749]: I0310 15:51:26.605620 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:26 crc kubenswrapper[4749]: E0310 15:51:26.606170 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:26 crc kubenswrapper[4749]: E0310 15:51:26.606253 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:26 crc kubenswrapper[4749]: E0310 15:51:26.606307 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:27 crc kubenswrapper[4749]: I0310 15:51:27.464494 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwpmf_807d12f5-c95a-4a7e-91c5-128de3d2235c/kube-multus/1.log" Mar 10 15:51:28 crc kubenswrapper[4749]: I0310 15:51:28.606216 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:28 crc kubenswrapper[4749]: I0310 15:51:28.606275 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:28 crc kubenswrapper[4749]: I0310 15:51:28.606304 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:28 crc kubenswrapper[4749]: I0310 15:51:28.606315 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:28 crc kubenswrapper[4749]: E0310 15:51:28.606966 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:28 crc kubenswrapper[4749]: E0310 15:51:28.607275 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:28 crc kubenswrapper[4749]: E0310 15:51:28.607362 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:28 crc kubenswrapper[4749]: E0310 15:51:28.607438 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:28 crc kubenswrapper[4749]: E0310 15:51:28.700814 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:51:29 crc kubenswrapper[4749]: I0310 15:51:29.607302 4749 scope.go:117] "RemoveContainer" containerID="223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4" Mar 10 15:51:30 crc kubenswrapper[4749]: I0310 15:51:30.445558 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jpmqp"] Mar 10 15:51:30 crc kubenswrapper[4749]: I0310 15:51:30.445814 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:30 crc kubenswrapper[4749]: E0310 15:51:30.446035 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:30 crc kubenswrapper[4749]: I0310 15:51:30.478587 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/3.log" Mar 10 15:51:30 crc kubenswrapper[4749]: I0310 15:51:30.482446 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerStarted","Data":"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c"} Mar 10 15:51:30 crc kubenswrapper[4749]: I0310 15:51:30.482922 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:51:30 crc kubenswrapper[4749]: I0310 15:51:30.606234 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:30 crc kubenswrapper[4749]: I0310 15:51:30.606274 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:30 crc kubenswrapper[4749]: I0310 15:51:30.606308 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:30 crc kubenswrapper[4749]: E0310 15:51:30.606441 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:30 crc kubenswrapper[4749]: E0310 15:51:30.606568 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:30 crc kubenswrapper[4749]: E0310 15:51:30.606773 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:32 crc kubenswrapper[4749]: I0310 15:51:32.606712 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:32 crc kubenswrapper[4749]: I0310 15:51:32.606916 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:32 crc kubenswrapper[4749]: E0310 15:51:32.607236 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:32 crc kubenswrapper[4749]: I0310 15:51:32.607037 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:32 crc kubenswrapper[4749]: I0310 15:51:32.606950 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:32 crc kubenswrapper[4749]: E0310 15:51:32.607337 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:32 crc kubenswrapper[4749]: E0310 15:51:32.607487 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:32 crc kubenswrapper[4749]: E0310 15:51:32.607585 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:33 crc kubenswrapper[4749]: E0310 15:51:33.701517 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:51:34 crc kubenswrapper[4749]: I0310 15:51:34.605623 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:34 crc kubenswrapper[4749]: E0310 15:51:34.605791 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:34 crc kubenswrapper[4749]: I0310 15:51:34.605622 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:34 crc kubenswrapper[4749]: I0310 15:51:34.605931 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:34 crc kubenswrapper[4749]: E0310 15:51:34.606015 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:34 crc kubenswrapper[4749]: I0310 15:51:34.605957 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:34 crc kubenswrapper[4749]: E0310 15:51:34.606147 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:34 crc kubenswrapper[4749]: E0310 15:51:34.606183 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:36 crc kubenswrapper[4749]: I0310 15:51:36.605702 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:36 crc kubenswrapper[4749]: I0310 15:51:36.605772 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:36 crc kubenswrapper[4749]: I0310 15:51:36.605723 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:36 crc kubenswrapper[4749]: I0310 15:51:36.605703 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:36 crc kubenswrapper[4749]: E0310 15:51:36.605921 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:36 crc kubenswrapper[4749]: E0310 15:51:36.605991 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:36 crc kubenswrapper[4749]: E0310 15:51:36.609624 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:36 crc kubenswrapper[4749]: E0310 15:51:36.609819 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:38 crc kubenswrapper[4749]: I0310 15:51:38.606300 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:38 crc kubenswrapper[4749]: I0310 15:51:38.606347 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:38 crc kubenswrapper[4749]: I0310 15:51:38.606316 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:38 crc kubenswrapper[4749]: I0310 15:51:38.606322 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:38 crc kubenswrapper[4749]: E0310 15:51:38.606522 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:38 crc kubenswrapper[4749]: I0310 15:51:38.606703 4749 scope.go:117] "RemoveContainer" containerID="5018cd45279aff135b2d7eaa3883f7daf5bb7fbe0ceca1c731299f0aa32c35bd" Mar 10 15:51:38 crc kubenswrapper[4749]: E0310 15:51:38.606805 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:38 crc kubenswrapper[4749]: E0310 15:51:38.606874 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:38 crc kubenswrapper[4749]: E0310 15:51:38.606968 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:38 crc kubenswrapper[4749]: I0310 15:51:38.632068 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podStartSLOduration=157.63203473 podStartE2EDuration="2m37.63203473s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:30.514617045 +0000 UTC m=+187.636482742" watchObservedRunningTime="2026-03-10 15:51:38.63203473 +0000 UTC m=+195.753900417" Mar 10 15:51:38 crc kubenswrapper[4749]: E0310 15:51:38.702930 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 15:51:39 crc kubenswrapper[4749]: I0310 15:51:39.512223 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwpmf_807d12f5-c95a-4a7e-91c5-128de3d2235c/kube-multus/1.log" Mar 10 15:51:39 crc kubenswrapper[4749]: I0310 15:51:39.512732 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwpmf" event={"ID":"807d12f5-c95a-4a7e-91c5-128de3d2235c","Type":"ContainerStarted","Data":"750eab5b32a357211fac1cfd9b94b2e5c78d0358f83824912d275e65a6761fa0"} Mar 10 15:51:40 crc kubenswrapper[4749]: I0310 15:51:40.606625 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:40 crc kubenswrapper[4749]: I0310 15:51:40.606701 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:40 crc kubenswrapper[4749]: I0310 15:51:40.606717 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:40 crc kubenswrapper[4749]: I0310 15:51:40.606741 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:40 crc kubenswrapper[4749]: E0310 15:51:40.609677 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:40 crc kubenswrapper[4749]: E0310 15:51:40.609869 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:40 crc kubenswrapper[4749]: E0310 15:51:40.609890 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:40 crc kubenswrapper[4749]: E0310 15:51:40.610091 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:42 crc kubenswrapper[4749]: I0310 15:51:42.605664 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:42 crc kubenswrapper[4749]: I0310 15:51:42.605751 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:42 crc kubenswrapper[4749]: I0310 15:51:42.605793 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:42 crc kubenswrapper[4749]: I0310 15:51:42.605851 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:42 crc kubenswrapper[4749]: E0310 15:51:42.605839 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:51:42 crc kubenswrapper[4749]: E0310 15:51:42.605981 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:51:42 crc kubenswrapper[4749]: E0310 15:51:42.606067 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpmqp" podUID="cd3985af-f2c3-4f91-919e-2ea9420418b3" Mar 10 15:51:42 crc kubenswrapper[4749]: E0310 15:51:42.606118 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:51:44 crc kubenswrapper[4749]: I0310 15:51:44.606482 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:44 crc kubenswrapper[4749]: I0310 15:51:44.606561 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:44 crc kubenswrapper[4749]: I0310 15:51:44.606684 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:44 crc kubenswrapper[4749]: I0310 15:51:44.606713 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:44 crc kubenswrapper[4749]: I0310 15:51:44.609887 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 15:51:44 crc kubenswrapper[4749]: I0310 15:51:44.610644 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 15:51:44 crc kubenswrapper[4749]: I0310 15:51:44.612365 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 15:51:44 crc kubenswrapper[4749]: I0310 15:51:44.612572 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 15:51:44 crc kubenswrapper[4749]: I0310 15:51:44.612758 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 15:51:44 crc kubenswrapper[4749]: I0310 15:51:44.612969 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.897807 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.941114 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vlrtg"] Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.941670 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.943617 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.944913 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.947251 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7"] Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.947911 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.949115 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.949905 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.951468 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-9jhsg"] Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.952148 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9jhsg" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.952290 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh"] Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.952914 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.955739 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.957754 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb"] Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.958488 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.958535 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.959051 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.961233 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ftjlh"] Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.961252 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.961261 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.961963 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ftjlh" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.968513 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.970065 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.970304 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dc569"] Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.971078 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.971571 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5"] Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.971862 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.987001 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.987347 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.987895 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.987988 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 15:51:47 crc kubenswrapper[4749]: I0310 15:51:47.988145 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.007569 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.007752 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.007795 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.012103 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.012202 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.015502 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sw57w"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.016250 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5fc2g"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.016570 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8ntsm"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.016982 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.017485 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.017824 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5fc2g" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.019125 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.023286 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.024022 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.024126 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.038249 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.039890 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d78tn\" (UniqueName: \"kubernetes.io/projected/b1cf171a-a434-42b9-a974-ee6627c12968-kube-api-access-d78tn\") pod \"route-controller-manager-6576b87f9c-6qfk7\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.039949 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-serving-cert\") pod \"controller-manager-879f6c89f-vlrtg\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.039989 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vlrtg\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040011 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57c81670-31ff-425b-ae62-bfdb5126a2ae-config\") pod \"machine-approver-56656f9798-dh8rh\" (UID: \"57c81670-31ff-425b-ae62-bfdb5126a2ae\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040039 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e316a4-2927-490b-9ae2-5045827163d9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bswh5\" (UID: \"92e316a4-2927-490b-9ae2-5045827163d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cf171a-a434-42b9-a974-ee6627c12968-config\") pod \"route-controller-manager-6576b87f9c-6qfk7\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040105 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60ff7ae2-f62c-45f4-9f05-8030846bec81-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n2bhb\" (UID: \"60ff7ae2-f62c-45f4-9f05-8030846bec81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040127 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-client-ca\") pod \"controller-manager-879f6c89f-vlrtg\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040145 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49r6l\" (UniqueName: \"kubernetes.io/projected/c2bb4a2b-973e-4925-9be8-f51899269e8c-kube-api-access-49r6l\") pod \"dns-operator-744455d44c-ftjlh\" (UID: \"c2bb4a2b-973e-4925-9be8-f51899269e8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-ftjlh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040163 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/57c81670-31ff-425b-ae62-bfdb5126a2ae-auth-proxy-config\") pod \"machine-approver-56656f9798-dh8rh\" (UID: \"57c81670-31ff-425b-ae62-bfdb5126a2ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040190 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/57c81670-31ff-425b-ae62-bfdb5126a2ae-machine-approver-tls\") pod \"machine-approver-56656f9798-dh8rh\" (UID: \"57c81670-31ff-425b-ae62-bfdb5126a2ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040208 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkjn4\" (UniqueName: \"kubernetes.io/projected/9bf60777-23f7-4d99-a70e-a0f4733c54b1-kube-api-access-fkjn4\") pod \"downloads-7954f5f757-9jhsg\" (UID: \"9bf60777-23f7-4d99-a70e-a0f4733c54b1\") " pod="openshift-console/downloads-7954f5f757-9jhsg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040228 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60ff7ae2-f62c-45f4-9f05-8030846bec81-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-n2bhb\" (UID: 
\"60ff7ae2-f62c-45f4-9f05-8030846bec81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040246 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe-serving-cert\") pod \"openshift-config-operator-7777fb866f-dc569\" (UID: \"3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dc569\" (UID: \"3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040289 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrq7t\" (UniqueName: \"kubernetes.io/projected/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-kube-api-access-qrq7t\") pod \"controller-manager-879f6c89f-vlrtg\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040304 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1cf171a-a434-42b9-a974-ee6627c12968-client-ca\") pod \"route-controller-manager-6576b87f9c-6qfk7\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:48 crc kubenswrapper[4749]: 
I0310 15:51:48.040338 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1cf171a-a434-42b9-a974-ee6627c12968-serving-cert\") pod \"route-controller-manager-6576b87f9c-6qfk7\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040385 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkjbm\" (UniqueName: \"kubernetes.io/projected/60ff7ae2-f62c-45f4-9f05-8030846bec81-kube-api-access-kkjbm\") pod \"openshift-controller-manager-operator-756b6f6bc6-n2bhb\" (UID: \"60ff7ae2-f62c-45f4-9f05-8030846bec81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040403 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-config\") pod \"controller-manager-879f6c89f-vlrtg\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040422 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb4tg\" (UniqueName: \"kubernetes.io/projected/3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe-kube-api-access-xb4tg\") pod \"openshift-config-operator-7777fb866f-dc569\" (UID: \"3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040440 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c2bb4a2b-973e-4925-9be8-f51899269e8c-metrics-tls\") pod \"dns-operator-744455d44c-ftjlh\" (UID: \"c2bb4a2b-973e-4925-9be8-f51899269e8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-ftjlh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040458 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j65p4\" (UniqueName: \"kubernetes.io/projected/92e316a4-2927-490b-9ae2-5045827163d9-kube-api-access-j65p4\") pod \"openshift-apiserver-operator-796bbdcf4f-bswh5\" (UID: \"92e316a4-2927-490b-9ae2-5045827163d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040477 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgf6v\" (UniqueName: \"kubernetes.io/projected/57c81670-31ff-425b-ae62-bfdb5126a2ae-kube-api-access-rgf6v\") pod \"machine-approver-56656f9798-dh8rh\" (UID: \"57c81670-31ff-425b-ae62-bfdb5126a2ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.040504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92e316a4-2927-490b-9ae2-5045827163d9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bswh5\" (UID: \"92e316a4-2927-490b-9ae2-5045827163d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.092791 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.093153 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 10 15:51:48 crc 
kubenswrapper[4749]: I0310 15:51:48.095634 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-chntp"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.096065 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.096299 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.096348 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.096789 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.096880 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.097086 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.097821 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.099145 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.099309 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.099446 4749 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.099557 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.099661 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.099773 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.099892 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.100052 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.100173 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.100791 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.103464 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.104224 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.104513 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-client" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.104905 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.105124 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.105277 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.105512 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.112182 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.112460 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.112848 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.113064 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.113163 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.113264 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.113770 4749 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.114012 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.114177 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.115189 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.115674 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.115812 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.116662 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.116822 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.116849 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.116974 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.116997 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 
15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.117160 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.117360 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.117572 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.118333 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.118519 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.118655 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.118806 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.118953 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.119140 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.119191 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.119333 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.119510 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.119636 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.119755 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.119863 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.119997 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.124967 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-g6chs"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.132221 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m2m4f"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.134161 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.136620 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.127196 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.137543 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.150710 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.160990 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pgr7"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.161359 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.161756 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60ff7ae2-f62c-45f4-9f05-8030846bec81-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n2bhb\" (UID: \"60ff7ae2-f62c-45f4-9f05-8030846bec81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.161793 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1665ca47-2d24-469b-b53f-4d6b1b5b24c4-images\") pod \"machine-api-operator-5694c8668f-8ntsm\" (UID: \"1665ca47-2d24-469b-b53f-4d6b1b5b24c4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.161836 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-client-ca\") 
pod \"controller-manager-879f6c89f-vlrtg\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.161835 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.161857 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1665ca47-2d24-469b-b53f-4d6b1b5b24c4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8ntsm\" (UID: \"1665ca47-2d24-469b-b53f-4d6b1b5b24c4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.161885 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49r6l\" (UniqueName: \"kubernetes.io/projected/c2bb4a2b-973e-4925-9be8-f51899269e8c-kube-api-access-49r6l\") pod \"dns-operator-744455d44c-ftjlh\" (UID: \"c2bb4a2b-973e-4925-9be8-f51899269e8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-ftjlh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.161904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/57c81670-31ff-425b-ae62-bfdb5126a2ae-auth-proxy-config\") pod \"machine-approver-56656f9798-dh8rh\" (UID: \"57c81670-31ff-425b-ae62-bfdb5126a2ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.161945 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/57c81670-31ff-425b-ae62-bfdb5126a2ae-machine-approver-tls\") pod \"machine-approver-56656f9798-dh8rh\" (UID: 
\"57c81670-31ff-425b-ae62-bfdb5126a2ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.161971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkjn4\" (UniqueName: \"kubernetes.io/projected/9bf60777-23f7-4d99-a70e-a0f4733c54b1-kube-api-access-fkjn4\") pod \"downloads-7954f5f757-9jhsg\" (UID: \"9bf60777-23f7-4d99-a70e-a0f4733c54b1\") " pod="openshift-console/downloads-7954f5f757-9jhsg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.161990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjb5s\" (UniqueName: \"kubernetes.io/projected/1665ca47-2d24-469b-b53f-4d6b1b5b24c4-kube-api-access-wjb5s\") pod \"machine-api-operator-5694c8668f-8ntsm\" (UID: \"1665ca47-2d24-469b-b53f-4d6b1b5b24c4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162011 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/739717e7-ef4a-4032-82be-88a95648f3fe-config\") pod \"console-operator-58897d9998-5fc2g\" (UID: \"739717e7-ef4a-4032-82be-88a95648f3fe\") " pod="openshift-console-operator/console-operator-58897d9998-5fc2g" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162028 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg962\" (UniqueName: \"kubernetes.io/projected/9cb2105c-0588-48ef-a5c4-fff4723946a7-kube-api-access-bg962\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162050 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/60ff7ae2-f62c-45f4-9f05-8030846bec81-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-n2bhb\" (UID: \"60ff7ae2-f62c-45f4-9f05-8030846bec81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162072 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe-serving-cert\") pod \"openshift-config-operator-7777fb866f-dc569\" (UID: \"3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162088 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162098 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/739717e7-ef4a-4032-82be-88a95648f3fe-trusted-ca\") pod \"console-operator-58897d9998-5fc2g\" (UID: \"739717e7-ef4a-4032-82be-88a95648f3fe\") " pod="openshift-console-operator/console-operator-58897d9998-5fc2g" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162133 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb7pj\" (UniqueName: \"kubernetes.io/projected/5620f312-7196-4598-8c73-361e4784362d-kube-api-access-kb7pj\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162076 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162165 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb477892-41a4-4a6b-a006-d01eaf5bc502-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162200 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dc569\" (UID: \"3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162226 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv252\" (UniqueName: \"kubernetes.io/projected/739717e7-ef4a-4032-82be-88a95648f3fe-kube-api-access-sv252\") pod \"console-operator-58897d9998-5fc2g\" (UID: \"739717e7-ef4a-4032-82be-88a95648f3fe\") " pod="openshift-console-operator/console-operator-58897d9998-5fc2g" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162253 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrq7t\" (UniqueName: \"kubernetes.io/projected/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-kube-api-access-qrq7t\") pod \"controller-manager-879f6c89f-vlrtg\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162278 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1cf171a-a434-42b9-a974-ee6627c12968-client-ca\") pod 
\"route-controller-manager-6576b87f9c-6qfk7\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162308 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5620f312-7196-4598-8c73-361e4784362d-etcd-client\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162341 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5jl9\" (UniqueName: \"kubernetes.io/projected/9a8388aa-0189-449e-9fbd-71eeb26b1643-kube-api-access-v5jl9\") pod \"cluster-samples-operator-665b6dd947-b5rxg\" (UID: \"9a8388aa-0189-449e-9fbd-71eeb26b1643\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162386 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5620f312-7196-4598-8c73-361e4784362d-audit-dir\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162424 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb477892-41a4-4a6b-a006-d01eaf5bc502-config\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162468 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5620f312-7196-4598-8c73-361e4784362d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162500 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a8388aa-0189-449e-9fbd-71eeb26b1643-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-b5rxg\" (UID: \"9a8388aa-0189-449e-9fbd-71eeb26b1643\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162527 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9cb2105c-0588-48ef-a5c4-fff4723946a7-etcd-client\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162555 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5620f312-7196-4598-8c73-361e4784362d-serving-cert\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162607 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cb2105c-0588-48ef-a5c4-fff4723946a7-serving-cert\") pod \"etcd-operator-b45778765-sw57w\" (UID: 
\"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162635 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1cf171a-a434-42b9-a974-ee6627c12968-serving-cert\") pod \"route-controller-manager-6576b87f9c-6qfk7\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162667 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1665ca47-2d24-469b-b53f-4d6b1b5b24c4-config\") pod \"machine-api-operator-5694c8668f-8ntsm\" (UID: \"1665ca47-2d24-469b-b53f-4d6b1b5b24c4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162696 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkjbm\" (UniqueName: \"kubernetes.io/projected/60ff7ae2-f62c-45f4-9f05-8030846bec81-kube-api-access-kkjbm\") pod \"openshift-controller-manager-operator-756b6f6bc6-n2bhb\" (UID: \"60ff7ae2-f62c-45f4-9f05-8030846bec81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162717 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5620f312-7196-4598-8c73-361e4784362d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162768 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-config\") pod \"controller-manager-879f6c89f-vlrtg\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162793 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb4tg\" (UniqueName: \"kubernetes.io/projected/3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe-kube-api-access-xb4tg\") pod \"openshift-config-operator-7777fb866f-dc569\" (UID: \"3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162820 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9cb2105c-0588-48ef-a5c4-fff4723946a7-etcd-ca\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162847 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2bb4a2b-973e-4925-9be8-f51899269e8c-metrics-tls\") pod \"dns-operator-744455d44c-ftjlh\" (UID: \"c2bb4a2b-973e-4925-9be8-f51899269e8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-ftjlh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb477892-41a4-4a6b-a006-d01eaf5bc502-service-ca-bundle\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 
15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162896 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb477892-41a4-4a6b-a006-d01eaf5bc502-serving-cert\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162922 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcl6b\" (UniqueName: \"kubernetes.io/projected/bb477892-41a4-4a6b-a006-d01eaf5bc502-kube-api-access-dcl6b\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162982 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j65p4\" (UniqueName: \"kubernetes.io/projected/92e316a4-2927-490b-9ae2-5045827163d9-kube-api-access-j65p4\") pod \"openshift-apiserver-operator-796bbdcf4f-bswh5\" (UID: \"92e316a4-2927-490b-9ae2-5045827163d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.163107 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgf6v\" (UniqueName: \"kubernetes.io/projected/57c81670-31ff-425b-ae62-bfdb5126a2ae-kube-api-access-rgf6v\") pod \"machine-approver-56656f9798-dh8rh\" (UID: \"57c81670-31ff-425b-ae62-bfdb5126a2ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.163164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/92e316a4-2927-490b-9ae2-5045827163d9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bswh5\" (UID: \"92e316a4-2927-490b-9ae2-5045827163d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.163211 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cb2105c-0588-48ef-a5c4-fff4723946a7-config\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.163245 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5620f312-7196-4598-8c73-361e4784362d-audit-policies\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.163278 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d78tn\" (UniqueName: \"kubernetes.io/projected/b1cf171a-a434-42b9-a974-ee6627c12968-kube-api-access-d78tn\") pod \"route-controller-manager-6576b87f9c-6qfk7\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.163309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-serving-cert\") pod \"controller-manager-879f6c89f-vlrtg\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: 
I0310 15:51:48.163340 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5620f312-7196-4598-8c73-361e4784362d-encryption-config\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.163394 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vlrtg\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.163419 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57c81670-31ff-425b-ae62-bfdb5126a2ae-config\") pod \"machine-approver-56656f9798-dh8rh\" (UID: \"57c81670-31ff-425b-ae62-bfdb5126a2ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.163442 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e316a4-2927-490b-9ae2-5045827163d9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bswh5\" (UID: \"92e316a4-2927-490b-9ae2-5045827163d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.163465 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cf171a-a434-42b9-a974-ee6627c12968-config\") pod \"route-controller-manager-6576b87f9c-6qfk7\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.163491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/739717e7-ef4a-4032-82be-88a95648f3fe-serving-cert\") pod \"console-operator-58897d9998-5fc2g\" (UID: \"739717e7-ef4a-4032-82be-88a95648f3fe\") " pod="openshift-console-operator/console-operator-58897d9998-5fc2g" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.163519 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9cb2105c-0588-48ef-a5c4-fff4723946a7-etcd-service-ca\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.163871 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-client-ca\") pod \"controller-manager-879f6c89f-vlrtg\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.164114 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dc569\" (UID: \"3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.170409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b1cf171a-a434-42b9-a974-ee6627c12968-client-ca\") pod \"route-controller-manager-6576b87f9c-6qfk7\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.162872 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.177114 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-config\") pod \"controller-manager-879f6c89f-vlrtg\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.179137 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vlrtg\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.181444 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/57c81670-31ff-425b-ae62-bfdb5126a2ae-auth-proxy-config\") pod \"machine-approver-56656f9798-dh8rh\" (UID: \"57c81670-31ff-425b-ae62-bfdb5126a2ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.181963 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.182132 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"image-import-ca" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.171016 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.182931 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.184444 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.184585 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.184717 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.184827 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.184941 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.185311 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.188618 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.189717 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sqn6s"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.192268 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-96msp"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.192568 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.192683 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.193047 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sqn6s" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.194821 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.195281 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.195988 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.196409 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.196952 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-96msp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.197444 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.197847 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.198178 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.198719 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.198990 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.200115 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.202074 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.202760 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.202935 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.203100 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.207193 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cf171a-a434-42b9-a974-ee6627c12968-config\") pod \"route-controller-manager-6576b87f9c-6qfk7\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.207334 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6g5bk"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.207847 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-serving-cert\") pod \"controller-manager-879f6c89f-vlrtg\" (UID: 
\"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.208157 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.208165 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/57c81670-31ff-425b-ae62-bfdb5126a2ae-machine-approver-tls\") pod \"machine-approver-56656f9798-dh8rh\" (UID: \"57c81670-31ff-425b-ae62-bfdb5126a2ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.209026 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92e316a4-2927-490b-9ae2-5045827163d9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bswh5\" (UID: \"92e316a4-2927-490b-9ae2-5045827163d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.209206 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe-serving-cert\") pod \"openshift-config-operator-7777fb866f-dc569\" (UID: \"3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.209257 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e316a4-2927-490b-9ae2-5045827163d9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bswh5\" (UID: \"92e316a4-2927-490b-9ae2-5045827163d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5" 
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.210079 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.215096 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57c81670-31ff-425b-ae62-bfdb5126a2ae-config\") pod \"machine-approver-56656f9798-dh8rh\" (UID: \"57c81670-31ff-425b-ae62-bfdb5126a2ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.216559 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2bb4a2b-973e-4925-9be8-f51899269e8c-metrics-tls\") pod \"dns-operator-744455d44c-ftjlh\" (UID: \"c2bb4a2b-973e-4925-9be8-f51899269e8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-ftjlh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.216845 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1cf171a-a434-42b9-a974-ee6627c12968-serving-cert\") pod \"route-controller-manager-6576b87f9c-6qfk7\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.217337 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60ff7ae2-f62c-45f4-9f05-8030846bec81-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n2bhb\" (UID: \"60ff7ae2-f62c-45f4-9f05-8030846bec81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.218861 4749 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b2dsj"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.219937 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.220295 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrq7t\" (UniqueName: \"kubernetes.io/projected/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-kube-api-access-qrq7t\") pod \"controller-manager-879f6c89f-vlrtg\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.220555 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.224286 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60ff7ae2-f62c-45f4-9f05-8030846bec81-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-n2bhb\" (UID: \"60ff7ae2-f62c-45f4-9f05-8030846bec81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.224802 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b2dsj" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.225000 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.225435 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.225705 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552630-vvkbm"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.226420 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.226719 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552630-vvkbm" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.226700 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j65p4\" (UniqueName: \"kubernetes.io/projected/92e316a4-2927-490b-9ae2-5045827163d9-kube-api-access-j65p4\") pod \"openshift-apiserver-operator-796bbdcf4f-bswh5\" (UID: \"92e316a4-2927-490b-9ae2-5045827163d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.226984 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkjn4\" (UniqueName: \"kubernetes.io/projected/9bf60777-23f7-4d99-a70e-a0f4733c54b1-kube-api-access-fkjn4\") pod \"downloads-7954f5f757-9jhsg\" (UID: \"9bf60777-23f7-4d99-a70e-a0f4733c54b1\") " pod="openshift-console/downloads-7954f5f757-9jhsg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.228265 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-czdm4"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.228749 4749 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-czdm4" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.228747 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49r6l\" (UniqueName: \"kubernetes.io/projected/c2bb4a2b-973e-4925-9be8-f51899269e8c-kube-api-access-49r6l\") pod \"dns-operator-744455d44c-ftjlh\" (UID: \"c2bb4a2b-973e-4925-9be8-f51899269e8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-ftjlh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.246796 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkjbm\" (UniqueName: \"kubernetes.io/projected/60ff7ae2-f62c-45f4-9f05-8030846bec81-kube-api-access-kkjbm\") pod \"openshift-controller-manager-operator-756b6f6bc6-n2bhb\" (UID: \"60ff7ae2-f62c-45f4-9f05-8030846bec81\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.248585 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d78tn\" (UniqueName: \"kubernetes.io/projected/b1cf171a-a434-42b9-a974-ee6627c12968-kube-api-access-d78tn\") pod \"route-controller-manager-6576b87f9c-6qfk7\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.249117 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgf6v\" (UniqueName: \"kubernetes.io/projected/57c81670-31ff-425b-ae62-bfdb5126a2ae-kube-api-access-rgf6v\") pod \"machine-approver-56656f9798-dh8rh\" (UID: \"57c81670-31ff-425b-ae62-bfdb5126a2ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.253118 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.255072 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.256147 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.256207 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb4tg\" (UniqueName: \"kubernetes.io/projected/3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe-kube-api-access-xb4tg\") pod \"openshift-config-operator-7777fb866f-dc569\" (UID: \"3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.259113 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.260150 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.263003 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.264628 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1665ca47-2d24-469b-b53f-4d6b1b5b24c4-images\") pod \"machine-api-operator-5694c8668f-8ntsm\" (UID: \"1665ca47-2d24-469b-b53f-4d6b1b5b24c4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.264701 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25940433-fa76-4378-87b1-fb387be619ec-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4mlvz\" (UID: \"25940433-fa76-4378-87b1-fb387be619ec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.264941 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9a0fb55b-db13-4353-b4cc-253386c29267-signing-key\") pod \"service-ca-9c57cc56f-96msp\" (UID: \"9a0fb55b-db13-4353-b4cc-253386c29267\") " pod="openshift-service-ca/service-ca-9c57cc56f-96msp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.265223 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31a8586b-f1b9-4b1a-b406-6d88768b4cf5-config\") pod \"kube-apiserver-operator-766d6c64bb-gqv9f\" (UID: \"31a8586b-f1b9-4b1a-b406-6d88768b4cf5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.265282 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rrf7c\" (UniqueName: \"kubernetes.io/projected/18f8edee-4182-4211-9036-f087d4d08f90-kube-api-access-rrf7c\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.265399 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-node-pullsecrets\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.265443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1665ca47-2d24-469b-b53f-4d6b1b5b24c4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8ntsm\" (UID: \"1665ca47-2d24-469b-b53f-4d6b1b5b24c4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.265484 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjb5s\" (UniqueName: \"kubernetes.io/projected/1665ca47-2d24-469b-b53f-4d6b1b5b24c4-kube-api-access-wjb5s\") pod \"machine-api-operator-5694c8668f-8ntsm\" (UID: \"1665ca47-2d24-469b-b53f-4d6b1b5b24c4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.265536 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad950382-6e38-44e0-b037-7d970035d8ca-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pzbmb\" (UID: \"ad950382-6e38-44e0-b037-7d970035d8ca\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.265665 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18f8edee-4182-4211-9036-f087d4d08f90-metrics-certs\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.265817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/739717e7-ef4a-4032-82be-88a95648f3fe-config\") pod \"console-operator-58897d9998-5fc2g\" (UID: \"739717e7-ef4a-4032-82be-88a95648f3fe\") " pod="openshift-console-operator/console-operator-58897d9998-5fc2g" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.265910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg962\" (UniqueName: \"kubernetes.io/projected/9cb2105c-0588-48ef-a5c4-fff4723946a7-kube-api-access-bg962\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.265948 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18f8edee-4182-4211-9036-f087d4d08f90-default-certificate\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.266198 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.266220 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-audit-dir\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.266175 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1665ca47-2d24-469b-b53f-4d6b1b5b24c4-images\") pod \"machine-api-operator-5694c8668f-8ntsm\" (UID: \"1665ca47-2d24-469b-b53f-4d6b1b5b24c4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.270335 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/739717e7-ef4a-4032-82be-88a95648f3fe-trusted-ca\") pod \"console-operator-58897d9998-5fc2g\" (UID: \"739717e7-ef4a-4032-82be-88a95648f3fe\") " pod="openshift-console-operator/console-operator-58897d9998-5fc2g" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.270460 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsnrd\" (UniqueName: \"kubernetes.io/projected/ad950382-6e38-44e0-b037-7d970035d8ca-kube-api-access-xsnrd\") pod \"cluster-image-registry-operator-dc59b4c8b-pzbmb\" (UID: \"ad950382-6e38-44e0-b037-7d970035d8ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.270540 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb7pj\" (UniqueName: \"kubernetes.io/projected/5620f312-7196-4598-8c73-361e4784362d-kube-api-access-kb7pj\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.270581 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr9gt\" (UniqueName: \"kubernetes.io/projected/a31d7167-46e8-4c6f-b511-a4a86aa908f2-kube-api-access-zr9gt\") pod \"control-plane-machine-set-operator-78cbb6b69f-sqn6s\" (UID: \"a31d7167-46e8-4c6f-b511-a4a86aa908f2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sqn6s" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.270617 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.270684 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb477892-41a4-4a6b-a006-d01eaf5bc502-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.270720 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/04b00aff-5cc3-4d6f-947f-1f73bc45ad32-images\") pod \"machine-config-operator-74547568cd-gs6kq\" (UID: \"04b00aff-5cc3-4d6f-947f-1f73bc45ad32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.270756 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-encryption-config\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.270796 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpqhd\" (UniqueName: \"kubernetes.io/projected/9a0fb55b-db13-4353-b4cc-253386c29267-kube-api-access-vpqhd\") pod \"service-ca-9c57cc56f-96msp\" (UID: \"9a0fb55b-db13-4353-b4cc-253386c29267\") " pod="openshift-service-ca/service-ca-9c57cc56f-96msp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.270831 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f041ecda-15b2-423a-9ceb-0edbf02db58f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-92jwt\" (UID: \"f041ecda-15b2-423a-9ceb-0edbf02db58f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.270867 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-etcd-client\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.270900 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-image-import-ca\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 
10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.270939 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv252\" (UniqueName: \"kubernetes.io/projected/739717e7-ef4a-4032-82be-88a95648f3fe-kube-api-access-sv252\") pod \"console-operator-58897d9998-5fc2g\" (UID: \"739717e7-ef4a-4032-82be-88a95648f3fe\") " pod="openshift-console-operator/console-operator-58897d9998-5fc2g" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.270982 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9a0fb55b-db13-4353-b4cc-253386c29267-signing-cabundle\") pod \"service-ca-9c57cc56f-96msp\" (UID: \"9a0fb55b-db13-4353-b4cc-253386c29267\") " pod="openshift-service-ca/service-ca-9c57cc56f-96msp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.271015 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b00aff-5cc3-4d6f-947f-1f73bc45ad32-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gs6kq\" (UID: \"04b00aff-5cc3-4d6f-947f-1f73bc45ad32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.271088 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f8edee-4182-4211-9036-f087d4d08f90-service-ca-bundle\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.271134 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t75mz\" (UniqueName: 
\"kubernetes.io/projected/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-kube-api-access-t75mz\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.271186 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5620f312-7196-4598-8c73-361e4784362d-etcd-client\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.271230 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a31d7167-46e8-4c6f-b511-a4a86aa908f2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sqn6s\" (UID: \"a31d7167-46e8-4c6f-b511-a4a86aa908f2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sqn6s" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.272810 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5jl9\" (UniqueName: \"kubernetes.io/projected/9a8388aa-0189-449e-9fbd-71eeb26b1643-kube-api-access-v5jl9\") pod \"cluster-samples-operator-665b6dd947-b5rxg\" (UID: \"9a8388aa-0189-449e-9fbd-71eeb26b1643\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.274602 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/739717e7-ef4a-4032-82be-88a95648f3fe-trusted-ca\") pod \"console-operator-58897d9998-5fc2g\" (UID: \"739717e7-ef4a-4032-82be-88a95648f3fe\") " pod="openshift-console-operator/console-operator-58897d9998-5fc2g" Mar 10 15:51:48 
crc kubenswrapper[4749]: I0310 15:51:48.284056 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-q8p7p"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.290455 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5620f312-7196-4598-8c73-361e4784362d-audit-dir\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.284954 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.285968 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb477892-41a4-4a6b-a006-d01eaf5bc502-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.290729 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5620f312-7196-4598-8c73-361e4784362d-audit-dir\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.286326 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/739717e7-ef4a-4032-82be-88a95648f3fe-config\") pod \"console-operator-58897d9998-5fc2g\" (UID: \"739717e7-ef4a-4032-82be-88a95648f3fe\") " pod="openshift-console-operator/console-operator-58897d9998-5fc2g" Mar 10 15:51:48 crc 
kubenswrapper[4749]: I0310 15:51:48.292006 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gvtf7"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.292621 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.293923 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gvtf7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.294312 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.294521 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9jhsg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.290772 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25940433-fa76-4378-87b1-fb387be619ec-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4mlvz\" (UID: \"25940433-fa76-4378-87b1-fb387be619ec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.294950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-audit\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295020 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bb477892-41a4-4a6b-a006-d01eaf5bc502-config\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295073 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25940433-fa76-4378-87b1-fb387be619ec-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4mlvz\" (UID: \"25940433-fa76-4378-87b1-fb387be619ec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295189 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f041ecda-15b2-423a-9ceb-0edbf02db58f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-92jwt\" (UID: \"f041ecda-15b2-423a-9ceb-0edbf02db58f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295257 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295255 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-serving-cert\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5620f312-7196-4598-8c73-361e4784362d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295551 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18f8edee-4182-4211-9036-f087d4d08f90-stats-auth\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295618 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a8388aa-0189-449e-9fbd-71eeb26b1643-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-b5rxg\" (UID: \"9a8388aa-0189-449e-9fbd-71eeb26b1643\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295649 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dvzk"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 
15:51:48.295654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9cb2105c-0588-48ef-a5c4-fff4723946a7-etcd-client\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295679 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5620f312-7196-4598-8c73-361e4784362d-serving-cert\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295700 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-etcd-serving-ca\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295735 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb477892-41a4-4a6b-a006-d01eaf5bc502-config\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295753 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cb2105c-0588-48ef-a5c4-fff4723946a7-serving-cert\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" Mar 10 15:51:48 crc 
kubenswrapper[4749]: I0310 15:51:48.295803 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1665ca47-2d24-469b-b53f-4d6b1b5b24c4-config\") pod \"machine-api-operator-5694c8668f-8ntsm\" (UID: \"1665ca47-2d24-469b-b53f-4d6b1b5b24c4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295827 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f041ecda-15b2-423a-9ceb-0edbf02db58f-config\") pod \"kube-controller-manager-operator-78b949d7b-92jwt\" (UID: \"f041ecda-15b2-423a-9ceb-0edbf02db58f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295857 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5620f312-7196-4598-8c73-361e4784362d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295880 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad950382-6e38-44e0-b037-7d970035d8ca-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pzbmb\" (UID: \"ad950382-6e38-44e0-b037-7d970035d8ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9cb2105c-0588-48ef-a5c4-fff4723946a7-etcd-ca\") pod \"etcd-operator-b45778765-sw57w\" (UID: 
\"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295944 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31a8586b-f1b9-4b1a-b406-6d88768b4cf5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gqv9f\" (UID: \"31a8586b-f1b9-4b1a-b406-6d88768b4cf5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.295992 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04b00aff-5cc3-4d6f-947f-1f73bc45ad32-proxy-tls\") pod \"machine-config-operator-74547568cd-gs6kq\" (UID: \"04b00aff-5cc3-4d6f-947f-1f73bc45ad32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296047 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb477892-41a4-4a6b-a006-d01eaf5bc502-service-ca-bundle\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296067 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb477892-41a4-4a6b-a006-d01eaf5bc502-serving-cert\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296088 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dcl6b\" (UniqueName: \"kubernetes.io/projected/bb477892-41a4-4a6b-a006-d01eaf5bc502-kube-api-access-dcl6b\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296128 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kckv\" (UniqueName: \"kubernetes.io/projected/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-kube-api-access-4kckv\") pod \"marketplace-operator-79b997595-2pgr7\" (UID: \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296149 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad950382-6e38-44e0-b037-7d970035d8ca-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pzbmb\" (UID: \"ad950382-6e38-44e0-b037-7d970035d8ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296168 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-config\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296200 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3adcc561-fe01-4280-9bdc-d650e3fa8b44-proxy-tls\") pod \"machine-config-controller-84d6567774-v76rs\" (UID: 
\"3adcc561-fe01-4280-9bdc-d650e3fa8b44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296221 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3adcc561-fe01-4280-9bdc-d650e3fa8b44-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-v76rs\" (UID: \"3adcc561-fe01-4280-9bdc-d650e3fa8b44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296240 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbtvj\" (UniqueName: \"kubernetes.io/projected/04b00aff-5cc3-4d6f-947f-1f73bc45ad32-kube-api-access-pbtvj\") pod \"machine-config-operator-74547568cd-gs6kq\" (UID: \"04b00aff-5cc3-4d6f-947f-1f73bc45ad32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cb2105c-0588-48ef-a5c4-fff4723946a7-config\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296292 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5620f312-7196-4598-8c73-361e4784362d-audit-policies\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296311 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86xvx\" (UniqueName: \"kubernetes.io/projected/3adcc561-fe01-4280-9bdc-d650e3fa8b44-kube-api-access-86xvx\") pod \"machine-config-controller-84d6567774-v76rs\" (UID: \"3adcc561-fe01-4280-9bdc-d650e3fa8b44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296335 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5620f312-7196-4598-8c73-361e4784362d-encryption-config\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2pgr7\" (UID: \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296354 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/739717e7-ef4a-4032-82be-88a95648f3fe-serving-cert\") pod \"console-operator-58897d9998-5fc2g\" (UID: \"739717e7-ef4a-4032-82be-88a95648f3fe\") " pod="openshift-console-operator/console-operator-58897d9998-5fc2g"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296442 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31a8586b-f1b9-4b1a-b406-6d88768b4cf5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gqv9f\" (UID: \"31a8586b-f1b9-4b1a-b406-6d88768b4cf5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9cb2105c-0588-48ef-a5c4-fff4723946a7-etcd-service-ca\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296475 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2pgr7\" (UID: \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.296984 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.297037 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5620f312-7196-4598-8c73-361e4784362d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.297665 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.297749 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.298346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5620f312-7196-4598-8c73-361e4784362d-etcd-client\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.298437 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1665ca47-2d24-469b-b53f-4d6b1b5b24c4-config\") pod \"machine-api-operator-5694c8668f-8ntsm\" (UID: \"1665ca47-2d24-469b-b53f-4d6b1b5b24c4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.298767 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1665ca47-2d24-469b-b53f-4d6b1b5b24c4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8ntsm\" (UID: \"1665ca47-2d24-469b-b53f-4d6b1b5b24c4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.298929 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5620f312-7196-4598-8c73-361e4784362d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.299136 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cb2105c-0588-48ef-a5c4-fff4723946a7-config\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.299678 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5620f312-7196-4598-8c73-361e4784362d-audit-policies\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.300048 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb477892-41a4-4a6b-a006-d01eaf5bc502-service-ca-bundle\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.300368 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vlrtg"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.300568 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9cb2105c-0588-48ef-a5c4-fff4723946a7-etcd-ca\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.301473 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5620f312-7196-4598-8c73-361e4784362d-serving-cert\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.301852 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9cb2105c-0588-48ef-a5c4-fff4723946a7-etcd-service-ca\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.302399 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.303649 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.305133 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.305345 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sw57w"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.306132 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dc569"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.307043 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/739717e7-ef4a-4032-82be-88a95648f3fe-serving-cert\") pod \"console-operator-58897d9998-5fc2g\" (UID: \"739717e7-ef4a-4032-82be-88a95648f3fe\") " pod="openshift-console-operator/console-operator-58897d9998-5fc2g"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.308281 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9jhsg"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.309428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9cb2105c-0588-48ef-a5c4-fff4723946a7-etcd-client\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.309803 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb477892-41a4-4a6b-a006-d01eaf5bc502-serving-cert\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.310447 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cb2105c-0588-48ef-a5c4-fff4723946a7-serving-cert\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.310602 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-chntp"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.311427 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ftjlh"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.311523 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.311696 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a8388aa-0189-449e-9fbd-71eeb26b1643-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-b5rxg\" (UID: \"9a8388aa-0189-449e-9fbd-71eeb26b1643\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.312550 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5620f312-7196-4598-8c73-361e4784362d-encryption-config\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.315538 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.317596 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-96msp"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.320102 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.322111 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.323053 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8ntsm"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.324363 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.326021 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5fc2g"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.327932 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.329577 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m2m4f"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.332969 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.334616 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.339400 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pgr7"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.342263 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-g6chs"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.343968 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.344495 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.345554 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.345591 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552630-vvkbm"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.347845 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q8p7p"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.349274 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wbdhx"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.351193 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xlbcp"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.351667 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wbdhx"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.364221 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ftjlh"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.365034 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.365626 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.365702 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.365716 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b2dsj"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.365879 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xlbcp"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.368854 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.375866 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sqn6s"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.380689 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.385162 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.385476 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.387968 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.389359 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wbdhx"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.389968 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.392978 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.393035 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dnk99"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.394785 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dnk99"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.394793 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg"]
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397591 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad950382-6e38-44e0-b037-7d970035d8ca-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pzbmb\" (UID: \"ad950382-6e38-44e0-b037-7d970035d8ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397622 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18f8edee-4182-4211-9036-f087d4d08f90-metrics-certs\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397649 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18f8edee-4182-4211-9036-f087d4d08f90-default-certificate\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397666 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-audit-dir\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397687 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsnrd\" (UniqueName: \"kubernetes.io/projected/ad950382-6e38-44e0-b037-7d970035d8ca-kube-api-access-xsnrd\") pod \"cluster-image-registry-operator-dc59b4c8b-pzbmb\" (UID: \"ad950382-6e38-44e0-b037-7d970035d8ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397708 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr9gt\" (UniqueName: \"kubernetes.io/projected/a31d7167-46e8-4c6f-b511-a4a86aa908f2-kube-api-access-zr9gt\") pod \"control-plane-machine-set-operator-78cbb6b69f-sqn6s\" (UID: \"a31d7167-46e8-4c6f-b511-a4a86aa908f2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sqn6s"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397724 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397807 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/04b00aff-5cc3-4d6f-947f-1f73bc45ad32-images\") pod \"machine-config-operator-74547568cd-gs6kq\" (UID: \"04b00aff-5cc3-4d6f-947f-1f73bc45ad32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397831 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-encryption-config\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397851 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f041ecda-15b2-423a-9ceb-0edbf02db58f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-92jwt\" (UID: \"f041ecda-15b2-423a-9ceb-0edbf02db58f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397869 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-etcd-client\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397885 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-image-import-ca\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpqhd\" (UniqueName: \"kubernetes.io/projected/9a0fb55b-db13-4353-b4cc-253386c29267-kube-api-access-vpqhd\") pod \"service-ca-9c57cc56f-96msp\" (UID: \"9a0fb55b-db13-4353-b4cc-253386c29267\") " pod="openshift-service-ca/service-ca-9c57cc56f-96msp"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397922 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9a0fb55b-db13-4353-b4cc-253386c29267-signing-cabundle\") pod \"service-ca-9c57cc56f-96msp\" (UID: \"9a0fb55b-db13-4353-b4cc-253386c29267\") " pod="openshift-service-ca/service-ca-9c57cc56f-96msp"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397940 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b00aff-5cc3-4d6f-947f-1f73bc45ad32-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gs6kq\" (UID: \"04b00aff-5cc3-4d6f-947f-1f73bc45ad32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397957 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f8edee-4182-4211-9036-f087d4d08f90-service-ca-bundle\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.397978 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t75mz\" (UniqueName: \"kubernetes.io/projected/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-kube-api-access-t75mz\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398007 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a31d7167-46e8-4c6f-b511-a4a86aa908f2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sqn6s\" (UID: \"a31d7167-46e8-4c6f-b511-a4a86aa908f2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sqn6s"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398040 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25940433-fa76-4378-87b1-fb387be619ec-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4mlvz\" (UID: \"25940433-fa76-4378-87b1-fb387be619ec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398056 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-audit\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398079 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25940433-fa76-4378-87b1-fb387be619ec-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4mlvz\" (UID: \"25940433-fa76-4378-87b1-fb387be619ec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f041ecda-15b2-423a-9ceb-0edbf02db58f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-92jwt\" (UID: \"f041ecda-15b2-423a-9ceb-0edbf02db58f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-serving-cert\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398131 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18f8edee-4182-4211-9036-f087d4d08f90-stats-auth\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398149 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-etcd-serving-ca\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398183 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f041ecda-15b2-423a-9ceb-0edbf02db58f-config\") pod \"kube-controller-manager-operator-78b949d7b-92jwt\" (UID: \"f041ecda-15b2-423a-9ceb-0edbf02db58f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398201 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad950382-6e38-44e0-b037-7d970035d8ca-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pzbmb\" (UID: \"ad950382-6e38-44e0-b037-7d970035d8ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398237 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31a8586b-f1b9-4b1a-b406-6d88768b4cf5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gqv9f\" (UID: \"31a8586b-f1b9-4b1a-b406-6d88768b4cf5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398256 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04b00aff-5cc3-4d6f-947f-1f73bc45ad32-proxy-tls\") pod \"machine-config-operator-74547568cd-gs6kq\" (UID: \"04b00aff-5cc3-4d6f-947f-1f73bc45ad32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398300 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kckv\" (UniqueName: \"kubernetes.io/projected/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-kube-api-access-4kckv\") pod \"marketplace-operator-79b997595-2pgr7\" (UID: \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398321 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad950382-6e38-44e0-b037-7d970035d8ca-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pzbmb\" (UID: \"ad950382-6e38-44e0-b037-7d970035d8ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398363 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-config\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3adcc561-fe01-4280-9bdc-d650e3fa8b44-proxy-tls\") pod \"machine-config-controller-84d6567774-v76rs\" (UID: \"3adcc561-fe01-4280-9bdc-d650e3fa8b44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398417 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3adcc561-fe01-4280-9bdc-d650e3fa8b44-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-v76rs\" (UID: \"3adcc561-fe01-4280-9bdc-d650e3fa8b44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398430 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-audit-dir\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398436 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbtvj\" (UniqueName: \"kubernetes.io/projected/04b00aff-5cc3-4d6f-947f-1f73bc45ad32-kube-api-access-pbtvj\") pod \"machine-config-operator-74547568cd-gs6kq\" (UID: \"04b00aff-5cc3-4d6f-947f-1f73bc45ad32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86xvx\" (UniqueName: \"kubernetes.io/projected/3adcc561-fe01-4280-9bdc-d650e3fa8b44-kube-api-access-86xvx\") pod \"machine-config-controller-84d6567774-v76rs\" (UID: \"3adcc561-fe01-4280-9bdc-d650e3fa8b44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398504 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2pgr7\" (UID: \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398525 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2pgr7\" (UID: \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31a8586b-f1b9-4b1a-b406-6d88768b4cf5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gqv9f\" (UID: \"31a8586b-f1b9-4b1a-b406-6d88768b4cf5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398578 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25940433-fa76-4378-87b1-fb387be619ec-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4mlvz\" (UID: \"25940433-fa76-4378-87b1-fb387be619ec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398597 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9a0fb55b-db13-4353-b4cc-253386c29267-signing-key\") pod \"service-ca-9c57cc56f-96msp\" (UID: \"9a0fb55b-db13-4353-b4cc-253386c29267\") " pod="openshift-service-ca/service-ca-9c57cc56f-96msp"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398619 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31a8586b-f1b9-4b1a-b406-6d88768b4cf5-config\") pod \"kube-apiserver-operator-766d6c64bb-gqv9f\" (UID: \"31a8586b-f1b9-4b1a-b406-6d88768b4cf5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398649 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrf7c\" (UniqueName: \"kubernetes.io/projected/18f8edee-4182-4211-9036-f087d4d08f90-kube-api-access-rrf7c\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-node-pullsecrets\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.398741 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-node-pullsecrets\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs"
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.399282 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad950382-6e38-44e0-b037-7d970035d8ca-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pzbmb\" (UID: \"ad950382-6e38-44e0-b037-7d970035d8ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.399503 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gvtf7"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.400303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.400572 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.401742 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b00aff-5cc3-4d6f-947f-1f73bc45ad32-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gs6kq\" (UID: \"04b00aff-5cc3-4d6f-947f-1f73bc45ad32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.401783 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-czdm4"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.404900 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-2pgr7\" (UID: \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.406817 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.406867 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dvzk"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.406879 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dnk99"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.407721 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f041ecda-15b2-423a-9ceb-0edbf02db58f-config\") pod \"kube-controller-manager-operator-78b949d7b-92jwt\" (UID: \"f041ecda-15b2-423a-9ceb-0edbf02db58f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.408504 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-etcd-serving-ca\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.409298 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3adcc561-fe01-4280-9bdc-d650e3fa8b44-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-v76rs\" (UID: \"3adcc561-fe01-4280-9bdc-d650e3fa8b44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs" Mar 10 15:51:48 crc 
kubenswrapper[4749]: I0310 15:51:48.409324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-image-import-ca\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.410484 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-config\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.410610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-audit\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.410658 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3adcc561-fe01-4280-9bdc-d650e3fa8b44-proxy-tls\") pod \"machine-config-controller-84d6567774-v76rs\" (UID: \"3adcc561-fe01-4280-9bdc-d650e3fa8b44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.411204 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xlbcp"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.412366 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-878z5"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.412523 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a31d7167-46e8-4c6f-b511-a4a86aa908f2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sqn6s\" (UID: \"a31d7167-46e8-4c6f-b511-a4a86aa908f2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sqn6s" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.413192 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-878z5" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.413606 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-etcd-client\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.414141 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-serving-cert\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.414872 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-encryption-config\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.415207 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad950382-6e38-44e0-b037-7d970035d8ca-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-pzbmb\" (UID: \"ad950382-6e38-44e0-b037-7d970035d8ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.416791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f041ecda-15b2-423a-9ceb-0edbf02db58f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-92jwt\" (UID: \"f041ecda-15b2-423a-9ceb-0edbf02db58f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.423014 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2pgr7\" (UID: \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.425932 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.430079 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.445851 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.461200 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9a0fb55b-db13-4353-b4cc-253386c29267-signing-key\") pod \"service-ca-9c57cc56f-96msp\" (UID: \"9a0fb55b-db13-4353-b4cc-253386c29267\") " pod="openshift-service-ca/service-ca-9c57cc56f-96msp" 
Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.475724 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.480135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9a0fb55b-db13-4353-b4cc-253386c29267-signing-cabundle\") pod \"service-ca-9c57cc56f-96msp\" (UID: \"9a0fb55b-db13-4353-b4cc-253386c29267\") " pod="openshift-service-ca/service-ca-9c57cc56f-96msp" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.487780 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.504964 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.524536 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.532334 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/04b00aff-5cc3-4d6f-947f-1f73bc45ad32-images\") pod \"machine-config-operator-74547568cd-gs6kq\" (UID: \"04b00aff-5cc3-4d6f-947f-1f73bc45ad32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.546662 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.564790 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25940433-fa76-4378-87b1-fb387be619ec-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-4mlvz\" (UID: \"25940433-fa76-4378-87b1-fb387be619ec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.567432 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vlrtg"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.567793 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.568597 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" event={"ID":"57c81670-31ff-425b-ae62-bfdb5126a2ae","Type":"ContainerStarted","Data":"f30adaa74e560f9f35fc2caaf4281af042d6bbe3a1bc27226f1d7777dccd4aef"} Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.585975 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.605295 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.616973 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04b00aff-5cc3-4d6f-947f-1f73bc45ad32-proxy-tls\") pod \"machine-config-operator-74547568cd-gs6kq\" (UID: \"04b00aff-5cc3-4d6f-947f-1f73bc45ad32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.625577 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 15:51:48 crc 
kubenswrapper[4749]: I0310 15:51:48.631676 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25940433-fa76-4378-87b1-fb387be619ec-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4mlvz\" (UID: \"25940433-fa76-4378-87b1-fb387be619ec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.644840 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.665883 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.685690 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.704397 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.715283 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31a8586b-f1b9-4b1a-b406-6d88768b4cf5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gqv9f\" (UID: \"31a8586b-f1b9-4b1a-b406-6d88768b4cf5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.726754 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.732357 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/31a8586b-f1b9-4b1a-b406-6d88768b4cf5-config\") pod \"kube-apiserver-operator-766d6c64bb-gqv9f\" (UID: \"31a8586b-f1b9-4b1a-b406-6d88768b4cf5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.744835 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.764780 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.784648 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.795038 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5"] Mar 10 15:51:48 crc kubenswrapper[4749]: W0310 15:51:48.807009 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92e316a4_2927_490b_9ae2_5045827163d9.slice/crio-8329c9c5fcaf51105c199ab9fa9fb9af12629510d857f584cda87fa2e61ff48d WatchSource:0}: Error finding container 8329c9c5fcaf51105c199ab9fa9fb9af12629510d857f584cda87fa2e61ff48d: Status 404 returned error can't find the container with id 8329c9c5fcaf51105c199ab9fa9fb9af12629510d857f584cda87fa2e61ff48d Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.822599 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.848849 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9jhsg"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.849961 4749 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.850264 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.865076 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.874768 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18f8edee-4182-4211-9036-f087d4d08f90-metrics-certs\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.884788 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.918505 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.927237 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18f8edee-4182-4211-9036-f087d4d08f90-default-certificate\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.933191 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.944850 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ftjlh"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 
15:51:48.945301 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.945690 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18f8edee-4182-4211-9036-f087d4d08f90-stats-auth\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.954507 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb"] Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.955722 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f8edee-4182-4211-9036-f087d4d08f90-service-ca-bundle\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.964938 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 15:51:48 crc kubenswrapper[4749]: I0310 15:51:48.977415 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dc569"] Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.005135 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.024947 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.046693 4749 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.064486 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.086503 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.104399 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.125039 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.145923 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.165154 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.185854 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.205648 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.225694 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.242487 4749 request.go:700] Waited for 1.013402041s due to client-side throttling, not 
priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.245191 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.265080 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.284861 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.304733 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.325187 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.345644 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.365218 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.385618 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.406054 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.424833 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.465322 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjb5s\" (UniqueName: \"kubernetes.io/projected/1665ca47-2d24-469b-b53f-4d6b1b5b24c4-kube-api-access-wjb5s\") pod \"machine-api-operator-5694c8668f-8ntsm\" (UID: \"1665ca47-2d24-469b-b53f-4d6b1b5b24c4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.481042 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg962\" (UniqueName: \"kubernetes.io/projected/9cb2105c-0588-48ef-a5c4-fff4723946a7-kube-api-access-bg962\") pod \"etcd-operator-b45778765-sw57w\" (UID: \"9cb2105c-0588-48ef-a5c4-fff4723946a7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.507313 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb7pj\" (UniqueName: \"kubernetes.io/projected/5620f312-7196-4598-8c73-361e4784362d-kube-api-access-kb7pj\") pod \"apiserver-7bbb656c7d-zm66b\" (UID: \"5620f312-7196-4598-8c73-361e4784362d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.529762 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv252\" (UniqueName: \"kubernetes.io/projected/739717e7-ef4a-4032-82be-88a95648f3fe-kube-api-access-sv252\") pod \"console-operator-58897d9998-5fc2g\" (UID: \"739717e7-ef4a-4032-82be-88a95648f3fe\") " pod="openshift-console-operator/console-operator-58897d9998-5fc2g" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.542154 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5jl9\" (UniqueName: \"kubernetes.io/projected/9a8388aa-0189-449e-9fbd-71eeb26b1643-kube-api-access-v5jl9\") pod \"cluster-samples-operator-665b6dd947-b5rxg\" (UID: \"9a8388aa-0189-449e-9fbd-71eeb26b1643\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.544613 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.565234 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.575351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5" event={"ID":"92e316a4-2927-490b-9ae2-5045827163d9","Type":"ContainerStarted","Data":"2ace6f5f8cc500dace2f8c7f0d8f126ea068d164d6a8b7860407fd4c3c4b3551"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.575495 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5" event={"ID":"92e316a4-2927-490b-9ae2-5045827163d9","Type":"ContainerStarted","Data":"8329c9c5fcaf51105c199ab9fa9fb9af12629510d857f584cda87fa2e61ff48d"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.577004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" event={"ID":"57c81670-31ff-425b-ae62-bfdb5126a2ae","Type":"ContainerStarted","Data":"ddceb87c8d2b575dc7cbb7cd5eb426faa1e0a77bfafdebb01bd6ee18c1d52e4b"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.577033 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" 
event={"ID":"57c81670-31ff-425b-ae62-bfdb5126a2ae","Type":"ContainerStarted","Data":"5c4d661950bde28f467ef34c1d740c75d32a63cfd90a602afd265bfc0b394683"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.579746 4749 generic.go:334] "Generic (PLEG): container finished" podID="3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe" containerID="30b4027af7fadccec8b6357740955d8ab94f42a2e35ee9cbf26c8e906b9ee7e6" exitCode=0 Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.579822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" event={"ID":"3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe","Type":"ContainerDied","Data":"30b4027af7fadccec8b6357740955d8ab94f42a2e35ee9cbf26c8e906b9ee7e6"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.579847 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" event={"ID":"3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe","Type":"ContainerStarted","Data":"1e84b44aaac668ff83e70b58d2158bf53975f7af6f17f99aa14d5716066e1b0c"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.584908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ftjlh" event={"ID":"c2bb4a2b-973e-4925-9be8-f51899269e8c","Type":"ContainerStarted","Data":"82ccb0c9f3b9800f3f1537d1f728eac6fbcd8b93d344f1855794fa9a3bb1c43c"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.584954 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ftjlh" event={"ID":"c2bb4a2b-973e-4925-9be8-f51899269e8c","Type":"ContainerStarted","Data":"98a4d3e1bcc9314145d3d7ec7c00286341a60fe2cab9c2238937cb03fd11e9db"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.584975 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ftjlh" 
event={"ID":"c2bb4a2b-973e-4925-9be8-f51899269e8c","Type":"ContainerStarted","Data":"b8673d73f232697cbff299538c510fa943b9ffe1aad7bc89af642588087660b5"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.587746 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.590764 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" event={"ID":"b1cf171a-a434-42b9-a974-ee6627c12968","Type":"ContainerStarted","Data":"0d9cb6fcff0ba813d0dce308a876dc48aa7fec622add3fda754eb81f7437271a"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.590827 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" event={"ID":"b1cf171a-a434-42b9-a974-ee6627c12968","Type":"ContainerStarted","Data":"1e53474cc1d573a0caae08534745a6b01f15aed96daf4be0a36b24607a39cada"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.591073 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.594001 4749 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-6qfk7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.593992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb" event={"ID":"60ff7ae2-f62c-45f4-9f05-8030846bec81","Type":"ContainerStarted","Data":"4ca217f6c36882bc92619704beec74c194b9d51a6c3bf11c946c068293d2a447"} Mar 10 
15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.594109 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb" event={"ID":"60ff7ae2-f62c-45f4-9f05-8030846bec81","Type":"ContainerStarted","Data":"436e066e42edf988221ba2dcae10bb52f157749d5bafaa459fc9b695bfa4f8b7"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.594190 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" podUID="b1cf171a-a434-42b9-a974-ee6627c12968" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.597014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9jhsg" event={"ID":"9bf60777-23f7-4d99-a70e-a0f4733c54b1","Type":"ContainerStarted","Data":"db78049aeff565164771430a9234486cf799edc54c63accaa869c91ba7c7d0ca"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.597065 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9jhsg" event={"ID":"9bf60777-23f7-4d99-a70e-a0f4733c54b1","Type":"ContainerStarted","Data":"0c97fa35e168553f05f206282c9f65ed7d116aa7c35c01efd307338fbffb2a20"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.597409 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-9jhsg" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.597739 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.599479 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" event={"ID":"2a0a1b4c-1c4c-4834-8321-ac4b86673e99","Type":"ContainerStarted","Data":"d57237169158c6dfd6355f0164ee6447060945c3da0f1a05e8f9de04eca82932"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.599514 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" event={"ID":"2a0a1b4c-1c4c-4834-8321-ac4b86673e99","Type":"ContainerStarted","Data":"faf1e496257d6dd838f9222b307d5f14e86d206cf72d8b5113f0d9d29e6329a7"} Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.600126 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.600978 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-9jhsg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.601055 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9jhsg" podUID="9bf60777-23f7-4d99-a70e-a0f4733c54b1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.602046 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-vlrtg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: 
connection refused" start-of-body= Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.602153 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" podUID="2a0a1b4c-1c4c-4834-8321-ac4b86673e99" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.604764 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.617327 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.623601 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.645330 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.669313 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5fc2g" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.674564 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.684705 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.709760 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.718041 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.729201 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.746434 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.764923 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.766602 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.788068 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.809519 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.824415 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.849485 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sw57w"] Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.851830 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.880789 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.888171 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.910995 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.926178 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.933867 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8ntsm"] Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.950667 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.985766 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 15:51:49 crc kubenswrapper[4749]: I0310 15:51:49.991290 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcl6b\" (UniqueName: \"kubernetes.io/projected/bb477892-41a4-4a6b-a006-d01eaf5bc502-kube-api-access-dcl6b\") pod \"authentication-operator-69f744f599-chntp\" (UID: \"bb477892-41a4-4a6b-a006-d01eaf5bc502\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.006402 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.029269 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"audit" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.038129 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5fc2g"] Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.049730 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.073429 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.086113 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg"] Mar 10 15:51:50 crc kubenswrapper[4749]: W0310 15:51:50.086238 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod739717e7_ef4a_4032_82be_88a95648f3fe.slice/crio-396b3eeb75da5ec5a6f9a65cb9601fb7e15a8108e1fcb59cfb486898119f1a3a WatchSource:0}: Error finding container 396b3eeb75da5ec5a6f9a65cb9601fb7e15a8108e1fcb59cfb486898119f1a3a: Status 404 returned error can't find the container with id 396b3eeb75da5ec5a6f9a65cb9601fb7e15a8108e1fcb59cfb486898119f1a3a Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.086444 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.091304 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.109940 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.125208 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.145460 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.170922 4749 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.189965 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.215937 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b"] Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.219066 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.219398 4749 ???:1] "http: TLS handshake error from 192.168.126.11:41254: no serving certificate available for the kubelet" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.224924 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.242534 4749 request.go:700] Waited for 1.847392324s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Ddefault-dockercfg-2llfx&limit=500&resourceVersion=0 Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.246140 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 15:51:50 crc kubenswrapper[4749]: W0310 15:51:50.253241 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5620f312_7196_4598_8c73_361e4784362d.slice/crio-c4b9197a473cd1b0fb7bb26116dfe7ae2fcdbd8a06776845448ee4eb91b03097 WatchSource:0}: Error finding container c4b9197a473cd1b0fb7bb26116dfe7ae2fcdbd8a06776845448ee4eb91b03097: Status 404 returned error can't find the container with id c4b9197a473cd1b0fb7bb26116dfe7ae2fcdbd8a06776845448ee4eb91b03097 Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.265603 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.285209 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.320643 4749 ???:1] "http: TLS handshake error from 192.168.126.11:41262: no serving certificate available for the kubelet" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.331902 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpqhd\" (UniqueName: \"kubernetes.io/projected/9a0fb55b-db13-4353-b4cc-253386c29267-kube-api-access-vpqhd\") pod \"service-ca-9c57cc56f-96msp\" (UID: \"9a0fb55b-db13-4353-b4cc-253386c29267\") " pod="openshift-service-ca/service-ca-9c57cc56f-96msp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.349128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsnrd\" 
(UniqueName: \"kubernetes.io/projected/ad950382-6e38-44e0-b037-7d970035d8ca-kube-api-access-xsnrd\") pod \"cluster-image-registry-operator-dc59b4c8b-pzbmb\" (UID: \"ad950382-6e38-44e0-b037-7d970035d8ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.387511 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86xvx\" (UniqueName: \"kubernetes.io/projected/3adcc561-fe01-4280-9bdc-d650e3fa8b44-kube-api-access-86xvx\") pod \"machine-config-controller-84d6567774-v76rs\" (UID: \"3adcc561-fe01-4280-9bdc-d650e3fa8b44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.388727 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr9gt\" (UniqueName: \"kubernetes.io/projected/a31d7167-46e8-4c6f-b511-a4a86aa908f2-kube-api-access-zr9gt\") pod \"control-plane-machine-set-operator-78cbb6b69f-sqn6s\" (UID: \"a31d7167-46e8-4c6f-b511-a4a86aa908f2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sqn6s" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.424826 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kckv\" (UniqueName: \"kubernetes.io/projected/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-kube-api-access-4kckv\") pod \"marketplace-operator-79b997595-2pgr7\" (UID: \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.427775 4749 ???:1] "http: TLS handshake error from 192.168.126.11:41276: no serving certificate available for the kubelet" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.428339 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrf7c\" (UniqueName: 
\"kubernetes.io/projected/18f8edee-4182-4211-9036-f087d4d08f90-kube-api-access-rrf7c\") pod \"router-default-5444994796-6g5bk\" (UID: \"18f8edee-4182-4211-9036-f087d4d08f90\") " pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.448119 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f041ecda-15b2-423a-9ceb-0edbf02db58f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-92jwt\" (UID: \"f041ecda-15b2-423a-9ceb-0edbf02db58f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.451444 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.460829 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.473095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbtvj\" (UniqueName: \"kubernetes.io/projected/04b00aff-5cc3-4d6f-947f-1f73bc45ad32-kube-api-access-pbtvj\") pod \"machine-config-operator-74547568cd-gs6kq\" (UID: \"04b00aff-5cc3-4d6f-947f-1f73bc45ad32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.476967 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.488636 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sqn6s" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.500071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t75mz\" (UniqueName: \"kubernetes.io/projected/9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7-kube-api-access-t75mz\") pod \"apiserver-76f77b778f-g6chs\" (UID: \"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7\") " pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.515246 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31a8586b-f1b9-4b1a-b406-6d88768b4cf5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gqv9f\" (UID: \"31a8586b-f1b9-4b1a-b406-6d88768b4cf5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.516337 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-96msp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.518157 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-chntp"] Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.521622 4749 ???:1] "http: TLS handshake error from 192.168.126.11:41292: no serving certificate available for the kubelet" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.544502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad950382-6e38-44e0-b037-7d970035d8ca-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pzbmb\" (UID: \"ad950382-6e38-44e0-b037-7d970035d8ca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.546017 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.561702 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25940433-fa76-4378-87b1-fb387be619ec-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4mlvz\" (UID: \"25940433-fa76-4378-87b1-fb387be619ec\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.566972 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 15:51:50 crc kubenswrapper[4749]: W0310 15:51:50.593989 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb477892_41a4_4a6b_a006_d01eaf5bc502.slice/crio-0afb07696380500335aa5c6d98d831355ef6bc948d76456318d01948a0bf689f WatchSource:0}: Error finding container 0afb07696380500335aa5c6d98d831355ef6bc948d76456318d01948a0bf689f: Status 404 returned error can't find the container with id 0afb07696380500335aa5c6d98d831355ef6bc948d76456318d01948a0bf689f Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.594318 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.595450 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.605026 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.626409 4749 ???:1] "http: TLS handshake error from 192.168.126.11:41302: no serving certificate available for the kubelet" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.644507 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.665468 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-trusted-ca\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.665611 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9098e467-df40-4bb8-bd7c-639d6e59ca82-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qhtxq\" (UID: \"9098e467-df40-4bb8-bd7c-639d6e59ca82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.665663 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mljj6\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-kube-api-access-mljj6\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.665694 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-registry-certificates\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.665749 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2dn4d\" (UniqueName: \"kubernetes.io/projected/9098e467-df40-4bb8-bd7c-639d6e59ca82-kube-api-access-2dn4d\") pod \"ingress-operator-5b745b69d9-qhtxq\" (UID: \"9098e467-df40-4bb8-bd7c-639d6e59ca82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.665775 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9098e467-df40-4bb8-bd7c-639d6e59ca82-metrics-tls\") pod \"ingress-operator-5b745b69d9-qhtxq\" (UID: \"9098e467-df40-4bb8-bd7c-639d6e59ca82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.665813 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.681688 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.681749 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9098e467-df40-4bb8-bd7c-639d6e59ca82-trusted-ca\") pod \"ingress-operator-5b745b69d9-qhtxq\" (UID: \"9098e467-df40-4bb8-bd7c-639d6e59ca82\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.681832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.681877 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-bound-sa-token\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.681922 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-registry-tls\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: E0310 15:51:50.682812 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:51.182793785 +0000 UTC m=+208.304659472 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.684707 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" event={"ID":"bb477892-41a4-4a6b-a006-d01eaf5bc502","Type":"ContainerStarted","Data":"0afb07696380500335aa5c6d98d831355ef6bc948d76456318d01948a0bf689f"} Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.697857 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5fc2g" event={"ID":"739717e7-ef4a-4032-82be-88a95648f3fe","Type":"ContainerStarted","Data":"d9479d2711defa70af8d0b94fda0934b022748db95bf79a1dd3055e4485a7cfb"} Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.697927 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5fc2g" event={"ID":"739717e7-ef4a-4032-82be-88a95648f3fe","Type":"ContainerStarted","Data":"396b3eeb75da5ec5a6f9a65cb9601fb7e15a8108e1fcb59cfb486898119f1a3a"} Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.703202 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-5fc2g" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.705076 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.707672 4749 patch_prober.go:28] interesting pod/console-operator-58897d9998-5fc2g container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.707764 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5fc2g" podUID="739717e7-ef4a-4032-82be-88a95648f3fe" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.715924 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" event={"ID":"9cb2105c-0588-48ef-a5c4-fff4723946a7","Type":"ContainerStarted","Data":"a0e8d1f6efb401d2bd3098340a35641ea83aa220b511a8de3599b9aaa4e1505e"} Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.716005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" event={"ID":"9cb2105c-0588-48ef-a5c4-fff4723946a7","Type":"ContainerStarted","Data":"99474f20467b4e6a2d4327db620f5ba5b335bbb62c208d2d9a36475f93f8ca42"} Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.754913 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" event={"ID":"5620f312-7196-4598-8c73-361e4784362d","Type":"ContainerStarted","Data":"c4b9197a473cd1b0fb7bb26116dfe7ae2fcdbd8a06776845448ee4eb91b03097"} Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.769957 4749 ???:1] "http: TLS handshake error from 192.168.126.11:41308: no serving certificate available for the kubelet" Mar 10 
15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.782904 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783119 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8skvg\" (UniqueName: \"kubernetes.io/projected/a875d292-8a92-4500-87a4-84aa079141e5-kube-api-access-8skvg\") pod \"ingress-canary-dnk99\" (UID: \"a875d292-8a92-4500-87a4-84aa079141e5\") " pod="openshift-ingress-canary/ingress-canary-dnk99" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783171 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-bound-sa-token\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783194 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/60a5f517-7a57-4ff9-b1f7-c8a932a21649-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5sv4c\" (UID: \"60a5f517-7a57-4ff9-b1f7-c8a932a21649\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783216 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-registry-tls\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: 
\"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783235 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-socket-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-mountpoint-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783276 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpxng\" (UniqueName: \"kubernetes.io/projected/404512a7-dc95-419b-a631-2384dd109476-kube-api-access-fpxng\") pod \"multus-admission-controller-857f4d67dd-gvtf7\" (UID: \"404512a7-dc95-419b-a631-2384dd109476\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gvtf7" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783354 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1f72075-e615-4332-9439-1aa531ddfccc-profile-collector-cert\") pod \"catalog-operator-68c6474976-49t26\" (UID: \"a1f72075-e615-4332-9439-1aa531ddfccc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783396 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b009296-7e7c-4e1b-bec2-24cf75849218-config-volume\") pod \"collect-profiles-29552625-97htr\" (UID: \"3b009296-7e7c-4e1b-bec2-24cf75849218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783436 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-plugins-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783481 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6htkl\" (UniqueName: \"kubernetes.io/projected/60a5f517-7a57-4ff9-b1f7-c8a932a21649-kube-api-access-6htkl\") pod \"olm-operator-6b444d44fb-5sv4c\" (UID: \"60a5f517-7a57-4ff9-b1f7-c8a932a21649\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcz5p\" (UniqueName: \"kubernetes.io/projected/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-kube-api-access-vcz5p\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9098e467-df40-4bb8-bd7c-639d6e59ca82-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qhtxq\" (UID: \"9098e467-df40-4bb8-bd7c-639d6e59ca82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" Mar 10 15:51:50 crc 
kubenswrapper[4749]: I0310 15:51:50.783560 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mljj6\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-kube-api-access-mljj6\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783692 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e4204ae-c210-41aa-8e8b-9c908c841143-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hgdgj\" (UID: \"1e4204ae-c210-41aa-8e8b-9c908c841143\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783723 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/404512a7-dc95-419b-a631-2384dd109476-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gvtf7\" (UID: \"404512a7-dc95-419b-a631-2384dd109476\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gvtf7" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783756 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dn4d\" (UniqueName: \"kubernetes.io/projected/9098e467-df40-4bb8-bd7c-639d6e59ca82-kube-api-access-2dn4d\") pod \"ingress-operator-5b745b69d9-qhtxq\" (UID: \"9098e467-df40-4bb8-bd7c-639d6e59ca82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783781 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a794e44c-8d0e-457c-9dd5-1b13aaa781d1-certs\") pod \"machine-config-server-878z5\" (UID: \"a794e44c-8d0e-457c-9dd5-1b13aaa781d1\") " pod="openshift-machine-config-operator/machine-config-server-878z5" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783857 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/60a5f517-7a57-4ff9-b1f7-c8a932a21649-srv-cert\") pod \"olm-operator-6b444d44fb-5sv4c\" (UID: \"60a5f517-7a57-4ff9-b1f7-c8a932a21649\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783901 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8jcm\" (UniqueName: \"kubernetes.io/projected/9677d802-8bb2-4791-a2bf-b27de7a948b7-kube-api-access-n8jcm\") pod \"migrator-59844c95c7-b2dsj\" (UID: \"9677d802-8bb2-4791-a2bf-b27de7a948b7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b2dsj" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783930 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.783998 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e563226-525d-4a05-8d5e-ebf573a3d8fe-serving-cert\") pod \"service-ca-operator-777779d784-czdm4\" (UID: \"2e563226-525d-4a05-8d5e-ebf573a3d8fe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-czdm4" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784016 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngdx9\" (UniqueName: \"kubernetes.io/projected/3b009296-7e7c-4e1b-bec2-24cf75849218-kube-api-access-ngdx9\") pod \"collect-profiles-29552625-97htr\" (UID: \"3b009296-7e7c-4e1b-bec2-24cf75849218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784058 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9098e467-df40-4bb8-bd7c-639d6e59ca82-trusted-ca\") pod \"ingress-operator-5b745b69d9-qhtxq\" (UID: \"9098e467-df40-4bb8-bd7c-639d6e59ca82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784085 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784107 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4z2w\" (UniqueName: \"kubernetes.io/projected/758c2971-70dd-483e-befe-278dd8b2b042-kube-api-access-l4z2w\") pod \"packageserver-d55dfcdfc-vdb7t\" (UID: \"758c2971-70dd-483e-befe-278dd8b2b042\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784168 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f9330f0-4357-4ab3-bca8-245dcabbd614-config-volume\") pod \"dns-default-wbdhx\" (UID: \"8f9330f0-4357-4ab3-bca8-245dcabbd614\") " pod="openshift-dns/dns-default-wbdhx" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784192 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/758c2971-70dd-483e-befe-278dd8b2b042-webhook-cert\") pod \"packageserver-d55dfcdfc-vdb7t\" (UID: \"758c2971-70dd-483e-befe-278dd8b2b042\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:50 crc 
kubenswrapper[4749]: I0310 15:51:50.784277 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784298 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-oauth-config\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784322 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/75e7b399-bd4e-44b1-8c75-f0d81588911d-audit-dir\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784389 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvc8m\" (UniqueName: \"kubernetes.io/projected/75e7b399-bd4e-44b1-8c75-f0d81588911d-kube-api-access-zvc8m\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784407 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-service-ca\") pod 
\"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784435 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6lg6\" (UniqueName: \"kubernetes.io/projected/971c0b09-9153-47f0-9bcb-0c4fb6496621-kube-api-access-j6lg6\") pod \"package-server-manager-789f6589d5-whhnl\" (UID: \"971c0b09-9153-47f0-9bcb-0c4fb6496621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784463 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-trusted-ca\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784534 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnn6q\" (UniqueName: \"kubernetes.io/projected/1e4204ae-c210-41aa-8e8b-9c908c841143-kube-api-access-wnn6q\") pod \"kube-storage-version-migrator-operator-b67b599dd-hgdgj\" (UID: \"1e4204ae-c210-41aa-8e8b-9c908c841143\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj" Mar 10 
15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784571 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e4204ae-c210-41aa-8e8b-9c908c841143-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hgdgj\" (UID: \"1e4204ae-c210-41aa-8e8b-9c908c841143\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784631 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a794e44c-8d0e-457c-9dd5-1b13aaa781d1-node-bootstrap-token\") pod \"machine-config-server-878z5\" (UID: \"a794e44c-8d0e-457c-9dd5-1b13aaa781d1\") " pod="openshift-machine-config-operator/machine-config-server-878z5" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784674 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784710 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-registry-certificates\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784742 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmx58\" 
(UniqueName: \"kubernetes.io/projected/2e563226-525d-4a05-8d5e-ebf573a3d8fe-kube-api-access-fmx58\") pod \"service-ca-operator-777779d784-czdm4\" (UID: \"2e563226-525d-4a05-8d5e-ebf573a3d8fe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-czdm4" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784793 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzr4f\" (UniqueName: \"kubernetes.io/projected/e9a7d78a-ab6f-456c-8433-5c1592d019c6-kube-api-access-tzr4f\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784828 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-csi-data-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784858 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a875d292-8a92-4500-87a4-84aa079141e5-cert\") pod \"ingress-canary-dnk99\" (UID: \"a875d292-8a92-4500-87a4-84aa079141e5\") " pod="openshift-ingress-canary/ingress-canary-dnk99" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784886 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-registration-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784904 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-serving-cert\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784921 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wpgv\" (UniqueName: \"kubernetes.io/projected/046a02a2-14f4-4368-9f21-58d96a510927-kube-api-access-6wpgv\") pod \"auto-csr-approver-29552630-vvkbm\" (UID: \"046a02a2-14f4-4368-9f21-58d96a510927\") " pod="openshift-infra/auto-csr-approver-29552630-vvkbm" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/971c0b09-9153-47f0-9bcb-0c4fb6496621-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-whhnl\" (UID: \"971c0b09-9153-47f0-9bcb-0c4fb6496621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.784996 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-config\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785013 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-trusted-ca-bundle\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785029 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/758c2971-70dd-483e-befe-278dd8b2b042-tmpfs\") pod \"packageserver-d55dfcdfc-vdb7t\" (UID: \"758c2971-70dd-483e-befe-278dd8b2b042\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785077 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf7gm\" (UniqueName: \"kubernetes.io/projected/a794e44c-8d0e-457c-9dd5-1b13aaa781d1-kube-api-access-pf7gm\") pod \"machine-config-server-878z5\" (UID: \"a794e44c-8d0e-457c-9dd5-1b13aaa781d1\") " pod="openshift-machine-config-operator/machine-config-server-878z5" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785106 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1f72075-e615-4332-9439-1aa531ddfccc-srv-cert\") pod \"catalog-operator-68c6474976-49t26\" (UID: \"a1f72075-e615-4332-9439-1aa531ddfccc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785131 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-audit-policies\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785153 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785176 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f9330f0-4357-4ab3-bca8-245dcabbd614-metrics-tls\") pod \"dns-default-wbdhx\" (UID: \"8f9330f0-4357-4ab3-bca8-245dcabbd614\") " pod="openshift-dns/dns-default-wbdhx" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785221 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9098e467-df40-4bb8-bd7c-639d6e59ca82-metrics-tls\") pod \"ingress-operator-5b745b69d9-qhtxq\" (UID: \"9098e467-df40-4bb8-bd7c-639d6e59ca82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: 
\"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785282 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b009296-7e7c-4e1b-bec2-24cf75849218-secret-volume\") pod \"collect-profiles-29552625-97htr\" (UID: \"3b009296-7e7c-4e1b-bec2-24cf75849218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785299 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e563226-525d-4a05-8d5e-ebf573a3d8fe-config\") pod \"service-ca-operator-777779d784-czdm4\" (UID: \"2e563226-525d-4a05-8d5e-ebf573a3d8fe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-czdm4" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785316 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785405 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785472 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-oauth-serving-cert\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785490 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhh88\" (UniqueName: \"kubernetes.io/projected/8f9330f0-4357-4ab3-bca8-245dcabbd614-kube-api-access-lhh88\") pod \"dns-default-wbdhx\" (UID: \"8f9330f0-4357-4ab3-bca8-245dcabbd614\") " pod="openshift-dns/dns-default-wbdhx" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785527 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svt5l\" (UniqueName: \"kubernetes.io/projected/a1f72075-e615-4332-9439-1aa531ddfccc-kube-api-access-svt5l\") pod \"catalog-operator-68c6474976-49t26\" (UID: \"a1f72075-e615-4332-9439-1aa531ddfccc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.785543 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/758c2971-70dd-483e-befe-278dd8b2b042-apiservice-cert\") pod \"packageserver-d55dfcdfc-vdb7t\" (UID: \"758c2971-70dd-483e-befe-278dd8b2b042\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:50 crc kubenswrapper[4749]: E0310 15:51:50.786056 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:51.286038148 +0000 UTC m=+208.407903825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.802972 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9098e467-df40-4bb8-bd7c-639d6e59ca82-trusted-ca\") pod \"ingress-operator-5b745b69d9-qhtxq\" (UID: \"9098e467-df40-4bb8-bd7c-639d6e59ca82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.806246 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.811356 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-registry-tls\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.818428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9098e467-df40-4bb8-bd7c-639d6e59ca82-metrics-tls\") pod \"ingress-operator-5b745b69d9-qhtxq\" (UID: \"9098e467-df40-4bb8-bd7c-639d6e59ca82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.820803 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" event={"ID":"1665ca47-2d24-469b-b53f-4d6b1b5b24c4","Type":"ContainerStarted","Data":"801fd8e041d6ec786cee641cf9f42eb69c779fe69826be5ac46541757e0d5d7c"} Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.820872 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" event={"ID":"1665ca47-2d24-469b-b53f-4d6b1b5b24c4","Type":"ContainerStarted","Data":"73b3b259bb1f44de7dad65b594e51df8639b7ac8b7359f54b5008050fa096d79"} Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.820889 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" event={"ID":"1665ca47-2d24-469b-b53f-4d6b1b5b24c4","Type":"ContainerStarted","Data":"8c42f2ee3b38a3e7162b996fbe662fcb6c12f7257ce6787c17c609eb79ce91cf"} Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.837699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-bound-sa-token\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.852839 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.866642 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.867363 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mljj6\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-kube-api-access-mljj6\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.867609 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dn4d\" (UniqueName: \"kubernetes.io/projected/9098e467-df40-4bb8-bd7c-639d6e59ca82-kube-api-access-2dn4d\") pod \"ingress-operator-5b745b69d9-qhtxq\" (UID: \"9098e467-df40-4bb8-bd7c-639d6e59ca82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.878684 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-registry-certificates\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.892562 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-trusted-ca\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895237 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e4204ae-c210-41aa-8e8b-9c908c841143-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hgdgj\" (UID: \"1e4204ae-c210-41aa-8e8b-9c908c841143\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895286 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a794e44c-8d0e-457c-9dd5-1b13aaa781d1-node-bootstrap-token\") pod \"machine-config-server-878z5\" (UID: \"a794e44c-8d0e-457c-9dd5-1b13aaa781d1\") " pod="openshift-machine-config-operator/machine-config-server-878z5" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895316 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895349 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmx58\" (UniqueName: \"kubernetes.io/projected/2e563226-525d-4a05-8d5e-ebf573a3d8fe-kube-api-access-fmx58\") pod \"service-ca-operator-777779d784-czdm4\" (UID: \"2e563226-525d-4a05-8d5e-ebf573a3d8fe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-czdm4" Mar 
10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895553 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzr4f\" (UniqueName: \"kubernetes.io/projected/e9a7d78a-ab6f-456c-8433-5c1592d019c6-kube-api-access-tzr4f\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895583 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a875d292-8a92-4500-87a4-84aa079141e5-cert\") pod \"ingress-canary-dnk99\" (UID: \"a875d292-8a92-4500-87a4-84aa079141e5\") " pod="openshift-ingress-canary/ingress-canary-dnk99" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895612 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-registration-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895638 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-csi-data-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/971c0b09-9153-47f0-9bcb-0c4fb6496621-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-whhnl\" (UID: \"971c0b09-9153-47f0-9bcb-0c4fb6496621\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-serving-cert\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895718 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895743 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wpgv\" (UniqueName: \"kubernetes.io/projected/046a02a2-14f4-4368-9f21-58d96a510927-kube-api-access-6wpgv\") pod \"auto-csr-approver-29552630-vvkbm\" (UID: \"046a02a2-14f4-4368-9f21-58d96a510927\") " pod="openshift-infra/auto-csr-approver-29552630-vvkbm" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895770 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-config\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895793 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-trusted-ca-bundle\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/758c2971-70dd-483e-befe-278dd8b2b042-tmpfs\") pod \"packageserver-d55dfcdfc-vdb7t\" (UID: \"758c2971-70dd-483e-befe-278dd8b2b042\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf7gm\" (UniqueName: \"kubernetes.io/projected/a794e44c-8d0e-457c-9dd5-1b13aaa781d1-kube-api-access-pf7gm\") pod \"machine-config-server-878z5\" (UID: \"a794e44c-8d0e-457c-9dd5-1b13aaa781d1\") " pod="openshift-machine-config-operator/machine-config-server-878z5" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895870 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1f72075-e615-4332-9439-1aa531ddfccc-srv-cert\") pod \"catalog-operator-68c6474976-49t26\" (UID: \"a1f72075-e615-4332-9439-1aa531ddfccc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895892 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-audit-policies\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895912 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895935 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f9330f0-4357-4ab3-bca8-245dcabbd614-metrics-tls\") pod \"dns-default-wbdhx\" (UID: \"8f9330f0-4357-4ab3-bca8-245dcabbd614\") " pod="openshift-dns/dns-default-wbdhx" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.895983 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.896009 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b009296-7e7c-4e1b-bec2-24cf75849218-secret-volume\") pod \"collect-profiles-29552625-97htr\" (UID: \"3b009296-7e7c-4e1b-bec2-24cf75849218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.896031 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e563226-525d-4a05-8d5e-ebf573a3d8fe-config\") pod \"service-ca-operator-777779d784-czdm4\" (UID: \"2e563226-525d-4a05-8d5e-ebf573a3d8fe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-czdm4" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.896054 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.896082 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-oauth-serving-cert\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.896104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhh88\" (UniqueName: \"kubernetes.io/projected/8f9330f0-4357-4ab3-bca8-245dcabbd614-kube-api-access-lhh88\") pod \"dns-default-wbdhx\" (UID: \"8f9330f0-4357-4ab3-bca8-245dcabbd614\") " pod="openshift-dns/dns-default-wbdhx" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.896132 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svt5l\" (UniqueName: \"kubernetes.io/projected/a1f72075-e615-4332-9439-1aa531ddfccc-kube-api-access-svt5l\") pod \"catalog-operator-68c6474976-49t26\" (UID: \"a1f72075-e615-4332-9439-1aa531ddfccc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.896152 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/758c2971-70dd-483e-befe-278dd8b2b042-apiservice-cert\") pod \"packageserver-d55dfcdfc-vdb7t\" (UID: \"758c2971-70dd-483e-befe-278dd8b2b042\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.896178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8skvg\" (UniqueName: \"kubernetes.io/projected/a875d292-8a92-4500-87a4-84aa079141e5-kube-api-access-8skvg\") pod \"ingress-canary-dnk99\" (UID: \"a875d292-8a92-4500-87a4-84aa079141e5\") " pod="openshift-ingress-canary/ingress-canary-dnk99" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.896210 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/60a5f517-7a57-4ff9-b1f7-c8a932a21649-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5sv4c\" (UID: \"60a5f517-7a57-4ff9-b1f7-c8a932a21649\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.896248 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-socket-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.896270 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-mountpoint-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.896306 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpxng\" (UniqueName: \"kubernetes.io/projected/404512a7-dc95-419b-a631-2384dd109476-kube-api-access-fpxng\") pod 
\"multus-admission-controller-857f4d67dd-gvtf7\" (UID: \"404512a7-dc95-419b-a631-2384dd109476\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gvtf7" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.896344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1f72075-e615-4332-9439-1aa531ddfccc-profile-collector-cert\") pod \"catalog-operator-68c6474976-49t26\" (UID: \"a1f72075-e615-4332-9439-1aa531ddfccc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.896366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b009296-7e7c-4e1b-bec2-24cf75849218-config-volume\") pod \"collect-profiles-29552625-97htr\" (UID: \"3b009296-7e7c-4e1b-bec2-24cf75849218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.915978 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6htkl\" (UniqueName: \"kubernetes.io/projected/60a5f517-7a57-4ff9-b1f7-c8a932a21649-kube-api-access-6htkl\") pod \"olm-operator-6b444d44fb-5sv4c\" (UID: \"60a5f517-7a57-4ff9-b1f7-c8a932a21649\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916026 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-plugins-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916066 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vcz5p\" (UniqueName: \"kubernetes.io/projected/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-kube-api-access-vcz5p\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916182 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e4204ae-c210-41aa-8e8b-9c908c841143-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hgdgj\" (UID: \"1e4204ae-c210-41aa-8e8b-9c908c841143\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916227 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/404512a7-dc95-419b-a631-2384dd109476-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gvtf7\" (UID: \"404512a7-dc95-419b-a631-2384dd109476\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gvtf7" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916274 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916335 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a794e44c-8d0e-457c-9dd5-1b13aaa781d1-certs\") pod \"machine-config-server-878z5\" (UID: \"a794e44c-8d0e-457c-9dd5-1b13aaa781d1\") " pod="openshift-machine-config-operator/machine-config-server-878z5" Mar 10 
15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916400 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/60a5f517-7a57-4ff9-b1f7-c8a932a21649-srv-cert\") pod \"olm-operator-6b444d44fb-5sv4c\" (UID: \"60a5f517-7a57-4ff9-b1f7-c8a932a21649\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916440 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8jcm\" (UniqueName: \"kubernetes.io/projected/9677d802-8bb2-4791-a2bf-b27de7a948b7-kube-api-access-n8jcm\") pod \"migrator-59844c95c7-b2dsj\" (UID: \"9677d802-8bb2-4791-a2bf-b27de7a948b7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b2dsj" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916488 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916527 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e563226-525d-4a05-8d5e-ebf573a3d8fe-serving-cert\") pod \"service-ca-operator-777779d784-czdm4\" (UID: \"2e563226-525d-4a05-8d5e-ebf573a3d8fe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-czdm4" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngdx9\" (UniqueName: \"kubernetes.io/projected/3b009296-7e7c-4e1b-bec2-24cf75849218-kube-api-access-ngdx9\") pod 
\"collect-profiles-29552625-97htr\" (UID: \"3b009296-7e7c-4e1b-bec2-24cf75849218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916606 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916671 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4z2w\" (UniqueName: \"kubernetes.io/projected/758c2971-70dd-483e-befe-278dd8b2b042-kube-api-access-l4z2w\") pod \"packageserver-d55dfcdfc-vdb7t\" (UID: \"758c2971-70dd-483e-befe-278dd8b2b042\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916710 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f9330f0-4357-4ab3-bca8-245dcabbd614-config-volume\") pod \"dns-default-wbdhx\" (UID: \"8f9330f0-4357-4ab3-bca8-245dcabbd614\") " pod="openshift-dns/dns-default-wbdhx" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916749 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/758c2971-70dd-483e-befe-278dd8b2b042-webhook-cert\") pod \"packageserver-d55dfcdfc-vdb7t\" (UID: \"758c2971-70dd-483e-befe-278dd8b2b042\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916857 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-oauth-config\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916908 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/75e7b399-bd4e-44b1-8c75-f0d81588911d-audit-dir\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: 
I0310 15:51:50.916943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvc8m\" (UniqueName: \"kubernetes.io/projected/75e7b399-bd4e-44b1-8c75-f0d81588911d-kube-api-access-zvc8m\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.916988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-service-ca\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.917024 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6lg6\" (UniqueName: \"kubernetes.io/projected/971c0b09-9153-47f0-9bcb-0c4fb6496621-kube-api-access-j6lg6\") pod \"package-server-manager-789f6589d5-whhnl\" (UID: \"971c0b09-9153-47f0-9bcb-0c4fb6496621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.917052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.917096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnn6q\" (UniqueName: \"kubernetes.io/projected/1e4204ae-c210-41aa-8e8b-9c908c841143-kube-api-access-wnn6q\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-hgdgj\" (UID: \"1e4204ae-c210-41aa-8e8b-9c908c841143\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.923676 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-serving-cert\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.925662 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-audit-policies\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.927241 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-mountpoint-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.927970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-socket-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.928449 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-registration-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.897990 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.898113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-csi-data-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.928843 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-plugins-dir\") pod \"csi-hostpathplugin-xlbcp\" (UID: \"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.900252 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" event={"ID":"3df37dd1-ad7d-4cdd-ab55-75a28f7be1fe","Type":"ContainerStarted","Data":"fdc9ccdb5a71e71fdd9a5bc85b27efcc531dc617195062d3fa5e9c89b326c1f7"} Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.928916 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 
15:51:50.909594 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-config\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.940266 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.940941 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.941708 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e563226-525d-4a05-8d5e-ebf573a3d8fe-config\") pod \"service-ca-operator-777779d784-czdm4\" (UID: \"2e563226-525d-4a05-8d5e-ebf573a3d8fe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-czdm4" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.947005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg" event={"ID":"9a8388aa-0189-449e-9fbd-71eeb26b1643","Type":"ContainerStarted","Data":"a1e88d665921b45dba549cc9db7e753c954eb02a08936845b8f4f60831238637"} Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 
15:51:50.947698 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-9jhsg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.947786 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9jhsg" podUID="9bf60777-23f7-4d99-a70e-a0f4733c54b1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.948744 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-oauth-serving-cert\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.915826 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1f72075-e615-4332-9439-1aa531ddfccc-srv-cert\") pod \"catalog-operator-68c6474976-49t26\" (UID: \"a1f72075-e615-4332-9439-1aa531ddfccc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.951635 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.952340 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-trusted-ca-bundle\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.955797 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/758c2971-70dd-483e-befe-278dd8b2b042-tmpfs\") pod \"packageserver-d55dfcdfc-vdb7t\" (UID: \"758c2971-70dd-483e-befe-278dd8b2b042\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.959499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b009296-7e7c-4e1b-bec2-24cf75849218-config-volume\") pod \"collect-profiles-29552625-97htr\" (UID: \"3b009296-7e7c-4e1b-bec2-24cf75849218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.965578 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.990226 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f9330f0-4357-4ab3-bca8-245dcabbd614-config-volume\") pod \"dns-default-wbdhx\" (UID: \"8f9330f0-4357-4ab3-bca8-245dcabbd614\") " pod="openshift-dns/dns-default-wbdhx" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.991545 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svt5l\" (UniqueName: \"kubernetes.io/projected/a1f72075-e615-4332-9439-1aa531ddfccc-kube-api-access-svt5l\") pod \"catalog-operator-68c6474976-49t26\" (UID: \"a1f72075-e615-4332-9439-1aa531ddfccc\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.911416 4749 ???:1] "http: TLS handshake error from 192.168.126.11:41310: no serving certificate available for the kubelet" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.910587 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e4204ae-c210-41aa-8e8b-9c908c841143-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hgdgj\" (UID: \"1e4204ae-c210-41aa-8e8b-9c908c841143\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.992395 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/75e7b399-bd4e-44b1-8c75-f0d81588911d-audit-dir\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.994614 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.995546 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a794e44c-8d0e-457c-9dd5-1b13aaa781d1-node-bootstrap-token\") pod \"machine-config-server-878z5\" (UID: \"a794e44c-8d0e-457c-9dd5-1b13aaa781d1\") " pod="openshift-machine-config-operator/machine-config-server-878z5" Mar 10 15:51:50 crc 
kubenswrapper[4749]: I0310 15:51:50.996006 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/758c2971-70dd-483e-befe-278dd8b2b042-apiservice-cert\") pod \"packageserver-d55dfcdfc-vdb7t\" (UID: \"758c2971-70dd-483e-befe-278dd8b2b042\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:50 crc kubenswrapper[4749]: I0310 15:51:50.996222 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:51 crc kubenswrapper[4749]: E0310 15:51:51.001278 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:51.501252654 +0000 UTC m=+208.623118511 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.010619 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-service-ca\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.014600 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9098e467-df40-4bb8-bd7c-639d6e59ca82-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qhtxq\" (UID: \"9098e467-df40-4bb8-bd7c-639d6e59ca82\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.014745 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.015245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/971c0b09-9153-47f0-9bcb-0c4fb6496621-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-whhnl\" (UID: \"971c0b09-9153-47f0-9bcb-0c4fb6496621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.016526 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.016996 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.019856 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:51 crc kubenswrapper[4749]: E0310 15:51:51.021456 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:51.521434053 +0000 UTC m=+208.643299740 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.022557 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/60a5f517-7a57-4ff9-b1f7-c8a932a21649-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5sv4c\" (UID: \"60a5f517-7a57-4ff9-b1f7-c8a932a21649\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.028741 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1f72075-e615-4332-9439-1aa531ddfccc-profile-collector-cert\") pod \"catalog-operator-68c6474976-49t26\" (UID: \"a1f72075-e615-4332-9439-1aa531ddfccc\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.028848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a875d292-8a92-4500-87a4-84aa079141e5-cert\") pod \"ingress-canary-dnk99\" (UID: \"a875d292-8a92-4500-87a4-84aa079141e5\") " pod="openshift-ingress-canary/ingress-canary-dnk99" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.030004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhh88\" (UniqueName: \"kubernetes.io/projected/8f9330f0-4357-4ab3-bca8-245dcabbd614-kube-api-access-lhh88\") pod \"dns-default-wbdhx\" (UID: \"8f9330f0-4357-4ab3-bca8-245dcabbd614\") " 
pod="openshift-dns/dns-default-wbdhx" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.031359 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.035320 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.039693 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f9330f0-4357-4ab3-bca8-245dcabbd614-metrics-tls\") pod \"dns-default-wbdhx\" (UID: \"8f9330f0-4357-4ab3-bca8-245dcabbd614\") " pod="openshift-dns/dns-default-wbdhx" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.047324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b009296-7e7c-4e1b-bec2-24cf75849218-secret-volume\") pod \"collect-profiles-29552625-97htr\" (UID: \"3b009296-7e7c-4e1b-bec2-24cf75849218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.049006 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wpgv\" (UniqueName: \"kubernetes.io/projected/046a02a2-14f4-4368-9f21-58d96a510927-kube-api-access-6wpgv\") pod \"auto-csr-approver-29552630-vvkbm\" (UID: 
\"046a02a2-14f4-4368-9f21-58d96a510927\") " pod="openshift-infra/auto-csr-approver-29552630-vvkbm" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.051505 4749 ???:1] "http: TLS handshake error from 192.168.126.11:41312: no serving certificate available for the kubelet" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.055121 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a794e44c-8d0e-457c-9dd5-1b13aaa781d1-certs\") pod \"machine-config-server-878z5\" (UID: \"a794e44c-8d0e-457c-9dd5-1b13aaa781d1\") " pod="openshift-machine-config-operator/machine-config-server-878z5" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.058342 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpxng\" (UniqueName: \"kubernetes.io/projected/404512a7-dc95-419b-a631-2384dd109476-kube-api-access-fpxng\") pod \"multus-admission-controller-857f4d67dd-gvtf7\" (UID: \"404512a7-dc95-419b-a631-2384dd109476\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gvtf7" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.073801 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e4204ae-c210-41aa-8e8b-9c908c841143-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hgdgj\" (UID: \"1e4204ae-c210-41aa-8e8b-9c908c841143\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.079704 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:51 
crc kubenswrapper[4749]: I0310 15:51:51.080063 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/60a5f517-7a57-4ff9-b1f7-c8a932a21649-srv-cert\") pod \"olm-operator-6b444d44fb-5sv4c\" (UID: \"60a5f517-7a57-4ff9-b1f7-c8a932a21649\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.081353 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmx58\" (UniqueName: \"kubernetes.io/projected/2e563226-525d-4a05-8d5e-ebf573a3d8fe-kube-api-access-fmx58\") pod \"service-ca-operator-777779d784-czdm4\" (UID: \"2e563226-525d-4a05-8d5e-ebf573a3d8fe\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-czdm4" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.081578 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/404512a7-dc95-419b-a631-2384dd109476-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gvtf7\" (UID: \"404512a7-dc95-419b-a631-2384dd109476\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gvtf7" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.082008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.086275 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e563226-525d-4a05-8d5e-ebf573a3d8fe-serving-cert\") pod \"service-ca-operator-777779d784-czdm4\" (UID: \"2e563226-525d-4a05-8d5e-ebf573a3d8fe\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-czdm4" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.090397 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/758c2971-70dd-483e-befe-278dd8b2b042-webhook-cert\") pod \"packageserver-d55dfcdfc-vdb7t\" (UID: \"758c2971-70dd-483e-befe-278dd8b2b042\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.092536 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8skvg\" (UniqueName: \"kubernetes.io/projected/a875d292-8a92-4500-87a4-84aa079141e5-kube-api-access-8skvg\") pod \"ingress-canary-dnk99\" (UID: \"a875d292-8a92-4500-87a4-84aa079141e5\") " pod="openshift-ingress-canary/ingress-canary-dnk99" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.093773 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzr4f\" (UniqueName: \"kubernetes.io/projected/e9a7d78a-ab6f-456c-8433-5c1592d019c6-kube-api-access-tzr4f\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.095602 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6htkl\" (UniqueName: \"kubernetes.io/projected/60a5f517-7a57-4ff9-b1f7-c8a932a21649-kube-api-access-6htkl\") pod \"olm-operator-6b444d44fb-5sv4c\" (UID: \"60a5f517-7a57-4ff9-b1f7-c8a932a21649\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.108455 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcz5p\" (UniqueName: \"kubernetes.io/projected/2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba-kube-api-access-vcz5p\") pod \"csi-hostpathplugin-xlbcp\" (UID: 
\"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba\") " pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.119085 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.121508 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.121682 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf7gm\" (UniqueName: \"kubernetes.io/projected/a794e44c-8d0e-457c-9dd5-1b13aaa781d1-kube-api-access-pf7gm\") pod \"machine-config-server-878z5\" (UID: \"a794e44c-8d0e-457c-9dd5-1b13aaa781d1\") " pod="openshift-machine-config-operator/machine-config-server-878z5" Mar 10 15:51:51 crc kubenswrapper[4749]: E0310 15:51:51.122222 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:51.622199184 +0000 UTC m=+208.744064871 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.129594 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552630-vvkbm" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.152314 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnn6q\" (UniqueName: \"kubernetes.io/projected/1e4204ae-c210-41aa-8e8b-9c908c841143-kube-api-access-wnn6q\") pod \"kube-storage-version-migrator-operator-b67b599dd-hgdgj\" (UID: \"1e4204ae-c210-41aa-8e8b-9c908c841143\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.181645 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gvtf7" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.187432 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvc8m\" (UniqueName: \"kubernetes.io/projected/75e7b399-bd4e-44b1-8c75-f0d81588911d-kube-api-access-zvc8m\") pod \"oauth-openshift-558db77b4-5dvzk\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.204560 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.205260 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4z2w\" (UniqueName: \"kubernetes.io/projected/758c2971-70dd-483e-befe-278dd8b2b042-kube-api-access-l4z2w\") pod \"packageserver-d55dfcdfc-vdb7t\" (UID: \"758c2971-70dd-483e-befe-278dd8b2b042\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.205290 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.209559 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngdx9\" (UniqueName: \"kubernetes.io/projected/3b009296-7e7c-4e1b-bec2-24cf75849218-kube-api-access-ngdx9\") pod \"collect-profiles-29552625-97htr\" (UID: \"3b009296-7e7c-4e1b-bec2-24cf75849218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.227354 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:51 crc kubenswrapper[4749]: E0310 15:51:51.228171 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:51.728151165 +0000 UTC m=+208.850016852 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.229698 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.230226 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wbdhx" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.236684 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6lg6\" (UniqueName: \"kubernetes.io/projected/971c0b09-9153-47f0-9bcb-0c4fb6496621-kube-api-access-j6lg6\") pod \"package-server-manager-789f6589d5-whhnl\" (UID: \"971c0b09-9153-47f0-9bcb-0c4fb6496621\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.249341 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.267093 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dnk99" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.291955 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-878z5" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.301695 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.310931 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.321580 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-oauth-config\") pod \"console-f9d7485db-q8p7p\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.321996 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.322412 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8jcm\" (UniqueName: \"kubernetes.io/projected/9677d802-8bb2-4791-a2bf-b27de7a948b7-kube-api-access-n8jcm\") pod \"migrator-59844c95c7-b2dsj\" (UID: \"9677d802-8bb2-4791-a2bf-b27de7a948b7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b2dsj" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.332668 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:51 crc kubenswrapper[4749]: E0310 15:51:51.333104 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:51.833087986 +0000 UTC m=+208.954953673 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.358054 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-czdm4" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.366404 4749 ???:1] "http: TLS handshake error from 192.168.126.11:41316: no serving certificate available for the kubelet" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.433830 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.434972 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:51 crc kubenswrapper[4749]: E0310 15:51:51.435477 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:51.935454524 +0000 UTC m=+209.057320211 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.451442 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.467432 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.502239 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dh8rh" podStartSLOduration=170.502211569 podStartE2EDuration="2m50.502211569s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:51.502198358 +0000 UTC m=+208.624064045" watchObservedRunningTime="2026-03-10 15:51:51.502211569 +0000 UTC m=+208.624077246" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.537347 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:51 crc kubenswrapper[4749]: E0310 15:51:51.538787 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:52.038755617 +0000 UTC m=+209.160621494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.563783 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b2dsj" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.638320 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:51 crc kubenswrapper[4749]: E0310 15:51:51.638563 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:52.138526401 +0000 UTC m=+209.260392088 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.639241 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:51 crc kubenswrapper[4749]: E0310 15:51:51.639847 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:52.139825768 +0000 UTC m=+209.261691455 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.736447 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bswh5" podStartSLOduration=170.7364131 podStartE2EDuration="2m50.7364131s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:51.731082627 +0000 UTC m=+208.852948314" watchObservedRunningTime="2026-03-10 15:51:51.7364131 +0000 UTC m=+208.858278787" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.740816 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:51 crc kubenswrapper[4749]: E0310 15:51:51.741449 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:52.241423634 +0000 UTC m=+209.363289321 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.769676 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-5fc2g" podStartSLOduration=170.769656914 podStartE2EDuration="2m50.769656914s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:51.768234743 +0000 UTC m=+208.890100430" watchObservedRunningTime="2026-03-10 15:51:51.769656914 +0000 UTC m=+208.891522601" Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.842558 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:51 crc kubenswrapper[4749]: E0310 15:51:51.842993 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:52.342978368 +0000 UTC m=+209.464844055 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.880126 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs"] Mar 10 15:51:51 crc kubenswrapper[4749]: I0310 15:51:51.948418 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:51 crc kubenswrapper[4749]: E0310 15:51:51.948754 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:52.448735553 +0000 UTC m=+209.570601240 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.050402 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.050725 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" event={"ID":"bb477892-41a4-4a6b-a006-d01eaf5bc502","Type":"ContainerStarted","Data":"e295a9b5d6fc4e9898288cef3319dd78ee8433364f210918563e758f97844b02"} Mar 10 15:51:52 crc kubenswrapper[4749]: E0310 15:51:52.050915 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:52.550895304 +0000 UTC m=+209.672761001 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.092467 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-96msp"] Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.110980 4749 generic.go:334] "Generic (PLEG): container finished" podID="5620f312-7196-4598-8c73-361e4784362d" containerID="4db8f132a99780865d0567f176d96b5d48f77645c6a83d4d5d75cc62ff93c6b2" exitCode=0 Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.111109 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" event={"ID":"5620f312-7196-4598-8c73-361e4784362d","Type":"ContainerDied","Data":"4db8f132a99780865d0567f176d96b5d48f77645c6a83d4d5d75cc62ff93c6b2"} Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.152603 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6g5bk" event={"ID":"18f8edee-4182-4211-9036-f087d4d08f90","Type":"ContainerStarted","Data":"8e83042add1c6813a3148f505972c44a47a179fb9ea84171bfcbe933b4688dbf"} Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.152815 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:52 crc kubenswrapper[4749]: E0310 15:51:52.154333 
4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:52.654309252 +0000 UTC m=+209.776174939 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.190044 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg" event={"ID":"9a8388aa-0189-449e-9fbd-71eeb26b1643","Type":"ContainerStarted","Data":"9fc0845a60c6ea1cda31f89c10b713ca35e5b5148bb06bb98122f049cc9e166a"} Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.192947 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ftjlh" podStartSLOduration=171.192934021 podStartE2EDuration="2m51.192934021s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:52.191570061 +0000 UTC m=+209.313435748" watchObservedRunningTime="2026-03-10 15:51:52.192934021 +0000 UTC m=+209.314799708" Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.220641 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-878z5" 
event={"ID":"a794e44c-8d0e-457c-9dd5-1b13aaa781d1","Type":"ContainerStarted","Data":"602970b0273480c6c480ca218c39f25109866249470dbef841bf8878a9c6b5f9"} Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.223931 4749 patch_prober.go:28] interesting pod/console-operator-58897d9998-5fc2g container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.223988 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5fc2g" podUID="739717e7-ef4a-4032-82be-88a95648f3fe" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.275818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:52 crc kubenswrapper[4749]: E0310 15:51:52.305357 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:52.805321876 +0000 UTC m=+209.927187563 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.310974 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sqn6s"] Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.322481 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt"] Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.376473 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vlrtg"] Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.379459 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:52 crc kubenswrapper[4749]: E0310 15:51:52.381001 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:52.880979957 +0000 UTC m=+210.002845644 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.445458 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f"] Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.484935 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:52 crc kubenswrapper[4749]: E0310 15:51:52.485709 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:52.985523006 +0000 UTC m=+210.107388693 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:52 crc kubenswrapper[4749]: W0310 15:51:52.505563 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a0fb55b_db13_4353_b4cc_253386c29267.slice/crio-f4bcec5ea34f1593ab300212aad41f38f64ca3bbb988b4c1bbfbcad5653e8b4f WatchSource:0}: Error finding container f4bcec5ea34f1593ab300212aad41f38f64ca3bbb988b4c1bbfbcad5653e8b4f: Status 404 returned error can't find the container with id f4bcec5ea34f1593ab300212aad41f38f64ca3bbb988b4c1bbfbcad5653e8b4f Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.534024 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7"] Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.570988 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" podStartSLOduration=171.570963909 podStartE2EDuration="2m51.570963909s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:52.569926218 +0000 UTC m=+209.691791905" watchObservedRunningTime="2026-03-10 15:51:52.570963909 +0000 UTC m=+209.692829596" Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.586102 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:52 crc kubenswrapper[4749]: E0310 15:51:52.587050 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:53.08703152 +0000 UTC m=+210.208897207 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.687847 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:52 crc kubenswrapper[4749]: E0310 15:51:52.688677 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:53.188645846 +0000 UTC m=+210.310511703 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.733704 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" podStartSLOduration=171.733676838 podStartE2EDuration="2m51.733676838s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:52.731688851 +0000 UTC m=+209.853554538" watchObservedRunningTime="2026-03-10 15:51:52.733676838 +0000 UTC m=+209.855542535" Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.753412 4749 ???:1] "http: TLS handshake error from 192.168.126.11:33778: no serving certificate available for the kubelet" Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.790549 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:52 crc kubenswrapper[4749]: E0310 15:51:52.791079 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 15:51:53.291057624 +0000 UTC m=+210.412923301 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.841549 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-9jhsg" podStartSLOduration=171.841527162 podStartE2EDuration="2m51.841527162s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:52.789738857 +0000 UTC m=+209.911604554" watchObservedRunningTime="2026-03-10 15:51:52.841527162 +0000 UTC m=+209.963392849" Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.893166 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:52 crc kubenswrapper[4749]: E0310 15:51:52.894015 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:53.393989928 +0000 UTC m=+210.515855795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.939410 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-sw57w" podStartSLOduration=171.939384581 podStartE2EDuration="2m51.939384581s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:52.906056424 +0000 UTC m=+210.027922111" watchObservedRunningTime="2026-03-10 15:51:52.939384581 +0000 UTC m=+210.061250268" Mar 10 15:51:52 crc kubenswrapper[4749]: I0310 15:51:52.996118 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:52 crc kubenswrapper[4749]: E0310 15:51:52.996658 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:53.496636224 +0000 UTC m=+210.618501911 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.097735 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:53 crc kubenswrapper[4749]: E0310 15:51:53.098150 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:53.598132976 +0000 UTC m=+210.719998663 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.201686 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:53 crc kubenswrapper[4749]: E0310 15:51:53.202564 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:53.702544442 +0000 UTC m=+210.824410129 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.216315 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dc569" Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.264460 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n2bhb" podStartSLOduration=172.264433028 podStartE2EDuration="2m52.264433028s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:53.195166861 +0000 UTC m=+210.317032568" watchObservedRunningTime="2026-03-10 15:51:53.264433028 +0000 UTC m=+210.386298715" Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.305647 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:53 crc kubenswrapper[4749]: E0310 15:51:53.306181 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 15:51:53.806155306 +0000 UTC m=+210.928021003 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.313027 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-8ntsm" podStartSLOduration=172.312984702 podStartE2EDuration="2m52.312984702s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:53.267746544 +0000 UTC m=+210.389612251" watchObservedRunningTime="2026-03-10 15:51:53.312984702 +0000 UTC m=+210.434850389" Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.323412 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb"] Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.357937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-878z5" event={"ID":"a794e44c-8d0e-457c-9dd5-1b13aaa781d1","Type":"ContainerStarted","Data":"ee7c2515865d625da697ba56a240aaa23bea13fc2209a00942c57b33b15755a4"} Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.408602 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:53 crc kubenswrapper[4749]: E0310 15:51:53.409606 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:53.909585204 +0000 UTC m=+211.031450891 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.409710 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:53 crc kubenswrapper[4749]: E0310 15:51:53.410146 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:53.91013872 +0000 UTC m=+211.032004407 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.417074 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f" event={"ID":"31a8586b-f1b9-4b1a-b406-6d88768b4cf5","Type":"ContainerStarted","Data":"477a5a63269b80e7ce6d5740539e278028a94c8e344bced2577eb1a31f973a32"} Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.426070 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz"] Mar 10 15:51:53 crc kubenswrapper[4749]: W0310 15:51:53.436068 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad950382_6e38_44e0_b037_7d970035d8ca.slice/crio-89e0fb9d377ff38f0eea64606b79403d8fc9d9968b3e7112dccdd7d0b37177e5 WatchSource:0}: Error finding container 89e0fb9d377ff38f0eea64606b79403d8fc9d9968b3e7112dccdd7d0b37177e5: Status 404 returned error can't find the container with id 89e0fb9d377ff38f0eea64606b79403d8fc9d9968b3e7112dccdd7d0b37177e5 Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.477253 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" podStartSLOduration=172.477218115 podStartE2EDuration="2m52.477218115s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-10 15:51:53.322952868 +0000 UTC m=+210.444818575" watchObservedRunningTime="2026-03-10 15:51:53.477218115 +0000 UTC m=+210.599083802" Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.489735 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sqn6s" event={"ID":"a31d7167-46e8-4c6f-b511-a4a86aa908f2","Type":"ContainerStarted","Data":"e614b0743d87b27e0e5f7a84f9712b60d72ea8b389db3692f6e4d1f772b7f619"} Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.507447 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-g6chs"] Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.513321 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:53 crc kubenswrapper[4749]: E0310 15:51:53.514805 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:54.014771013 +0000 UTC m=+211.136636720 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.516466 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt" event={"ID":"f041ecda-15b2-423a-9ceb-0edbf02db58f","Type":"ContainerStarted","Data":"d811398ad908da4afdeb6a48b22e0732cbaab8efdfedb4be617d652ec0a9ca7c"} Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.535201 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-chntp" podStartSLOduration=172.535176358 podStartE2EDuration="2m52.535176358s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:53.433859231 +0000 UTC m=+210.555724918" watchObservedRunningTime="2026-03-10 15:51:53.535176358 +0000 UTC m=+210.657042045" Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.536139 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq"] Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.554207 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs" event={"ID":"3adcc561-fe01-4280-9bdc-d650e3fa8b44","Type":"ContainerStarted","Data":"9b5dfaa46f9bbcb65a9475e65cd8e09f5e6c85899b955d838f76f18fd983e652"} Mar 10 15:51:53 crc 
kubenswrapper[4749]: I0310 15:51:53.554308 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs" event={"ID":"3adcc561-fe01-4280-9bdc-d650e3fa8b44","Type":"ContainerStarted","Data":"865a4c3b27b9162bd05efbb110ecdac7dca55d46068751579adf537a524a7327"} Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.588137 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pgr7"] Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.592476 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg" event={"ID":"9a8388aa-0189-449e-9fbd-71eeb26b1643","Type":"ContainerStarted","Data":"edc620c09ffd47b2f439db40554c3adf537d0609d3f12d33de77fbbb3df7e1f0"} Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.593519 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-878z5" podStartSLOduration=5.593483921 podStartE2EDuration="5.593483921s" podCreationTimestamp="2026-03-10 15:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:53.523569655 +0000 UTC m=+210.645435352" watchObservedRunningTime="2026-03-10 15:51:53.593483921 +0000 UTC m=+210.715349608" Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.594510 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt" podStartSLOduration=172.59450509 podStartE2EDuration="2m52.59450509s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:53.578287315 +0000 UTC 
m=+210.700153002" watchObservedRunningTime="2026-03-10 15:51:53.59450509 +0000 UTC m=+210.716370777" Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.634479 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:53 crc kubenswrapper[4749]: E0310 15:51:53.635447 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:54.135425575 +0000 UTC m=+211.257291262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.682812 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26"] Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.695563 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs" podStartSLOduration=172.695529229 podStartE2EDuration="2m52.695529229s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:53.649461128 +0000 UTC m=+210.771326825" watchObservedRunningTime="2026-03-10 15:51:53.695529229 +0000 UTC m=+210.817394936" Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.696716 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-b5rxg" podStartSLOduration=172.696709354 podStartE2EDuration="2m52.696709354s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:53.692838892 +0000 UTC m=+210.814704579" watchObservedRunningTime="2026-03-10 15:51:53.696709354 +0000 UTC m=+210.818575041" Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.698438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6g5bk" event={"ID":"18f8edee-4182-4211-9036-f087d4d08f90","Type":"ContainerStarted","Data":"ddb87c03e21f6e55d6727cda722d42a629dc21765e1baabaf138c1043f543575"} Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.743839 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:53 crc kubenswrapper[4749]: E0310 15:51:53.755793 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:54.255758858 +0000 UTC m=+211.377624545 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.760946 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" podUID="2a0a1b4c-1c4c-4834-8321-ac4b86673e99" containerName="controller-manager" containerID="cri-o://d57237169158c6dfd6355f0164ee6447060945c3da0f1a05e8f9de04eca82932" gracePeriod=30 Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.762136 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-96msp" event={"ID":"9a0fb55b-db13-4353-b4cc-253386c29267","Type":"ContainerStarted","Data":"f4bcec5ea34f1593ab300212aad41f38f64ca3bbb988b4c1bbfbcad5653e8b4f"} Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.762257 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" podUID="b1cf171a-a434-42b9-a974-ee6627c12968" containerName="route-controller-manager" containerID="cri-o://0d9cb6fcff0ba813d0dce308a876dc48aa7fec622add3fda754eb81f7437271a" gracePeriod=30 Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.859734 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:53 crc kubenswrapper[4749]: E0310 15:51:53.863011 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:54.362989775 +0000 UTC m=+211.484855642 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.930793 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj"] Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.935349 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t"] Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.957011 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552630-vvkbm"] Mar 10 15:51:53 crc kubenswrapper[4749]: I0310 15:51:53.974694 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:53 crc kubenswrapper[4749]: E0310 15:51:53.975693 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:54.475659168 +0000 UTC m=+211.597524855 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.062551 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.062930 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-96msp" podStartSLOduration=173.062905992 podStartE2EDuration="2m53.062905992s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:54.01232124 +0000 UTC m=+211.134186927" watchObservedRunningTime="2026-03-10 15:51:54.062905992 +0000 UTC m=+211.184771689" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.078147 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:54 crc kubenswrapper[4749]: E0310 15:51:54.084812 4749 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:54.58477669 +0000 UTC m=+211.706642377 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.099643 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6g5bk" podStartSLOduration=173.099583165 podStartE2EDuration="2m53.099583165s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:54.047120939 +0000 UTC m=+211.168986616" watchObservedRunningTime="2026-03-10 15:51:54.099583165 +0000 UTC m=+211.221448872" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.104721 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c"] Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.179363 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:54 crc kubenswrapper[4749]: E0310 15:51:54.180077 4749 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:54.680055474 +0000 UTC m=+211.801921161 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.233750 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wbdhx"] Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.243113 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr"] Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.274146 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl"] Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.281765 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:54 crc kubenswrapper[4749]: E0310 15:51:54.282758 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:54.782737651 +0000 UTC m=+211.904603338 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.297061 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dvzk"] Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.307032 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq"] Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.313206 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dnk99"] Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.323960 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gvtf7"] Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.342219 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q8p7p"] Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.362314 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b2dsj"] Mar 10 15:51:54 crc kubenswrapper[4749]: W0310 15:51:54.370554 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9098e467_df40_4bb8_bd7c_639d6e59ca82.slice/crio-ceee0a4478be28037480849c4c24a9890b7ddd1575e5eb8875e01d09459ebd89 WatchSource:0}: Error finding container ceee0a4478be28037480849c4c24a9890b7ddd1575e5eb8875e01d09459ebd89: Status 404 returned error can't find the container with id ceee0a4478be28037480849c4c24a9890b7ddd1575e5eb8875e01d09459ebd89 Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.385152 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-czdm4"] Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.386754 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:54 crc kubenswrapper[4749]: E0310 15:51:54.387384 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:54.887350482 +0000 UTC m=+212.009216169 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.401575 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-5fc2g" Mar 10 15:51:54 crc kubenswrapper[4749]: W0310 15:51:54.401864 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod404512a7_dc95_419b_a631_2384dd109476.slice/crio-4a79b3bf01d50fbc4fcd1aa10159ec3620ca54b7937c6e025635a283da93defa WatchSource:0}: Error finding container 4a79b3bf01d50fbc4fcd1aa10159ec3620ca54b7937c6e025635a283da93defa: Status 404 returned error can't find the container with id 4a79b3bf01d50fbc4fcd1aa10159ec3620ca54b7937c6e025635a283da93defa Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.411844 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xlbcp"] Mar 10 15:51:54 crc kubenswrapper[4749]: W0310 15:51:54.483319 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9677d802_8bb2_4791_a2bf_b27de7a948b7.slice/crio-a77c8c2f86ce01ee16c652247095e2676f342d2b854a6443bb6cfbabe73f9cd3 WatchSource:0}: Error finding container a77c8c2f86ce01ee16c652247095e2676f342d2b854a6443bb6cfbabe73f9cd3: Status 404 returned error can't find the container with id a77c8c2f86ce01ee16c652247095e2676f342d2b854a6443bb6cfbabe73f9cd3 Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.489199 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:54 crc kubenswrapper[4749]: E0310 15:51:54.489729 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:54.98971393 +0000 UTC m=+212.111579617 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.574535 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.600350 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:54 crc kubenswrapper[4749]: E0310 15:51:54.607911 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:55.107874621 +0000 UTC m=+212.229740298 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.660597 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.684026 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:51:54 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:51:54 crc kubenswrapper[4749]: [+]process-running ok 
Mar 10 15:51:54 crc kubenswrapper[4749]: healthz check failed Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.684127 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.704967 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cf171a-a434-42b9-a974-ee6627c12968-config\") pod \"b1cf171a-a434-42b9-a974-ee6627c12968\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.705080 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1cf171a-a434-42b9-a974-ee6627c12968-serving-cert\") pod \"b1cf171a-a434-42b9-a974-ee6627c12968\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.705136 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1cf171a-a434-42b9-a974-ee6627c12968-client-ca\") pod \"b1cf171a-a434-42b9-a974-ee6627c12968\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.705224 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d78tn\" (UniqueName: \"kubernetes.io/projected/b1cf171a-a434-42b9-a974-ee6627c12968-kube-api-access-d78tn\") pod \"b1cf171a-a434-42b9-a974-ee6627c12968\" (UID: \"b1cf171a-a434-42b9-a974-ee6627c12968\") " Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.705517 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:54 crc kubenswrapper[4749]: E0310 15:51:54.705866 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:55.205850012 +0000 UTC m=+212.327715699 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.706605 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1cf171a-a434-42b9-a974-ee6627c12968-client-ca" (OuterVolumeSpecName: "client-ca") pod "b1cf171a-a434-42b9-a974-ee6627c12968" (UID: "b1cf171a-a434-42b9-a974-ee6627c12968"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.706902 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1cf171a-a434-42b9-a974-ee6627c12968-config" (OuterVolumeSpecName: "config") pod "b1cf171a-a434-42b9-a974-ee6627c12968" (UID: "b1cf171a-a434-42b9-a974-ee6627c12968"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.728471 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1cf171a-a434-42b9-a974-ee6627c12968-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b1cf171a-a434-42b9-a974-ee6627c12968" (UID: "b1cf171a-a434-42b9-a974-ee6627c12968"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.739698 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.747173 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1cf171a-a434-42b9-a974-ee6627c12968-kube-api-access-d78tn" (OuterVolumeSpecName: "kube-api-access-d78tn") pod "b1cf171a-a434-42b9-a974-ee6627c12968" (UID: "b1cf171a-a434-42b9-a974-ee6627c12968"). InnerVolumeSpecName "kube-api-access-d78tn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.779339 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q8p7p" event={"ID":"e9a7d78a-ab6f-456c-8433-5c1592d019c6","Type":"ContainerStarted","Data":"562487b6cbbd4f453d481158bde8038970721f1e66464ac7d379d80fa7026a4d"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.780519 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" event={"ID":"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba","Type":"ContainerStarted","Data":"7cfc46dbd6d2ccad77d3230b386871e1b9c1d2784ebdb69fef08ce0e92716d7f"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.791611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dnk99" event={"ID":"a875d292-8a92-4500-87a4-84aa079141e5","Type":"ContainerStarted","Data":"27f48fe14a71b00440d9d2671f813a77163541340d7fd8873958c7c5ffe06125"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.803643 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" event={"ID":"a1f72075-e615-4332-9439-1aa531ddfccc","Type":"ContainerStarted","Data":"15f63c291c5a4b7b554996e14888bb6fa444dc4500639b4f4be504f1a96a8421"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.806493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj" event={"ID":"1e4204ae-c210-41aa-8e8b-9c908c841143","Type":"ContainerStarted","Data":"6d26ada454f6a4fdc71d370a689564f2591870c6a7eed8efec3d6d3044b0b2eb"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.807031 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:54 crc kubenswrapper[4749]: E0310 15:51:54.807242 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:55.307211791 +0000 UTC m=+212.429077478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.807308 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.807522 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1cf171a-a434-42b9-a974-ee6627c12968-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.807660 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1cf171a-a434-42b9-a974-ee6627c12968-client-ca\") on node \"crc\" 
DevicePath \"\"" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.807686 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d78tn\" (UniqueName: \"kubernetes.io/projected/b1cf171a-a434-42b9-a974-ee6627c12968-kube-api-access-d78tn\") on node \"crc\" DevicePath \"\"" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.807707 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cf171a-a434-42b9-a974-ee6627c12968-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:51:54 crc kubenswrapper[4749]: E0310 15:51:54.807817 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:55.307801818 +0000 UTC m=+212.429667505 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.808201 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552630-vvkbm" event={"ID":"046a02a2-14f4-4368-9f21-58d96a510927","Type":"ContainerStarted","Data":"fab4ea0f4df8cc6d910a30edf5baa1295c01c244de7cd87252772f4c06132a1b"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.815944 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v76rs" 
event={"ID":"3adcc561-fe01-4280-9bdc-d650e3fa8b44","Type":"ContainerStarted","Data":"ae690552c856031e7531531f18f6cc2ba6a13772f1423ee2f9b1b822fba755e5"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.819367 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sqn6s" event={"ID":"a31d7167-46e8-4c6f-b511-a4a86aa908f2","Type":"ContainerStarted","Data":"4e1d3b59a8f4886aa45ad493dad7d9d840b5d04b72ebb7e3ac62ff588896f2f5"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.825309 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-czdm4" event={"ID":"2e563226-525d-4a05-8d5e-ebf573a3d8fe","Type":"ContainerStarted","Data":"aefd0880303334246783e6cc12a314dce99ec9c4b06ed98419c4077e7a81a1cf"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.838216 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sqn6s" podStartSLOduration=173.83818621 podStartE2EDuration="2m53.83818621s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:54.838002955 +0000 UTC m=+211.959868642" watchObservedRunningTime="2026-03-10 15:51:54.83818621 +0000 UTC m=+211.960051897" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.842146 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl" event={"ID":"971c0b09-9153-47f0-9bcb-0c4fb6496621","Type":"ContainerStarted","Data":"72967e9fd7e20b23428f1ff5c1ef5d46eded142f28b9694b1e58b5501015d3b3"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.849743 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b2dsj" 
event={"ID":"9677d802-8bb2-4791-a2bf-b27de7a948b7","Type":"ContainerStarted","Data":"a77c8c2f86ce01ee16c652247095e2676f342d2b854a6443bb6cfbabe73f9cd3"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.852402 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" event={"ID":"758c2971-70dd-483e-befe-278dd8b2b042","Type":"ContainerStarted","Data":"965fe00954a2f83b4621e9b9afd87acd51114cddb35749ec1ce458d1c95ed38e"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.855607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb" event={"ID":"ad950382-6e38-44e0-b037-7d970035d8ca","Type":"ContainerStarted","Data":"09295c6840bd746ddac02c5f4c0787bb1c6f74c8a96d300b6db91fe9042f2321"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.855659 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb" event={"ID":"ad950382-6e38-44e0-b037-7d970035d8ca","Type":"ContainerStarted","Data":"89e0fb9d377ff38f0eea64606b79403d8fc9d9968b3e7112dccdd7d0b37177e5"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.857945 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" event={"ID":"75e7b399-bd4e-44b1-8c75-f0d81588911d","Type":"ContainerStarted","Data":"34c8c8d435533f2dd600f3199b88ae1ef7665d2165907ca2caf74366aa3837fe"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.859416 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-g6chs" event={"ID":"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7","Type":"ContainerStarted","Data":"e78b7c68bb44a38fceff3c99b4292de31c15b03606e0066b8e0c861310640075"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.873823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" event={"ID":"3b009296-7e7c-4e1b-bec2-24cf75849218","Type":"ContainerStarted","Data":"37f57a0b30ea155d4e7ea5eaf0d9bbd232b9db78c37355ef1c23fc3e23ce953d"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.874995 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pzbmb" podStartSLOduration=173.874961755 podStartE2EDuration="2m53.874961755s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:54.871706702 +0000 UTC m=+211.993572389" watchObservedRunningTime="2026-03-10 15:51:54.874961755 +0000 UTC m=+211.996827442" Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.888576 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" event={"ID":"60a5f517-7a57-4ff9-b1f7-c8a932a21649","Type":"ContainerStarted","Data":"fddb781518cf964b1780069b91344fe514c0ae9e24f1c0107d4aef17b8f44cf9"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.892191 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-92jwt" event={"ID":"f041ecda-15b2-423a-9ceb-0edbf02db58f","Type":"ContainerStarted","Data":"113bb3c2d191971da6ba21e490e4e496a66184759ca3b1b8482a33b4d0328fac"} Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.908910 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-client-ca\") pod \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.909015 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qrq7t\" (UniqueName: \"kubernetes.io/projected/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-kube-api-access-qrq7t\") pod \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.909147 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.909259 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-proxy-ca-bundles\") pod \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.909294 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-serving-cert\") pod \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.909338 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-config\") pod \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\" (UID: \"2a0a1b4c-1c4c-4834-8321-ac4b86673e99\") " Mar 10 15:51:54 crc kubenswrapper[4749]: E0310 15:51:54.909739 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 15:51:55.409697473 +0000 UTC m=+212.531563200 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.909904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:54 crc kubenswrapper[4749]: E0310 15:51:54.911478 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:55.411466003 +0000 UTC m=+212.533331880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.911568 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-client-ca" (OuterVolumeSpecName: "client-ca") pod "2a0a1b4c-1c4c-4834-8321-ac4b86673e99" (UID: "2a0a1b4c-1c4c-4834-8321-ac4b86673e99"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.911778 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2a0a1b4c-1c4c-4834-8321-ac4b86673e99" (UID: "2a0a1b4c-1c4c-4834-8321-ac4b86673e99"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.914228 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-config" (OuterVolumeSpecName: "config") pod "2a0a1b4c-1c4c-4834-8321-ac4b86673e99" (UID: "2a0a1b4c-1c4c-4834-8321-ac4b86673e99"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.918923 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-kube-api-access-qrq7t" (OuterVolumeSpecName: "kube-api-access-qrq7t") pod "2a0a1b4c-1c4c-4834-8321-ac4b86673e99" (UID: "2a0a1b4c-1c4c-4834-8321-ac4b86673e99"). InnerVolumeSpecName "kube-api-access-qrq7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.923212 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2a0a1b4c-1c4c-4834-8321-ac4b86673e99" (UID: "2a0a1b4c-1c4c-4834-8321-ac4b86673e99"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.934591 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" event={"ID":"5620f312-7196-4598-8c73-361e4784362d","Type":"ContainerStarted","Data":"0b0fe2271585eb5178771e44334d5c6b5a9137291fe43a09944ef710690a5762"}
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.948754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wbdhx" event={"ID":"8f9330f0-4357-4ab3-bca8-245dcabbd614","Type":"ContainerStarted","Data":"71ef3b19c58bc4becbd69abf94548415f324d0c76365a643ad6af9c6a3d87fa2"}
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.965828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz" event={"ID":"25940433-fa76-4378-87b1-fb387be619ec","Type":"ContainerStarted","Data":"918ca25b8c15e4306d49cc85bd1832688bcbd31d811875b3f25b020ce4e95756"}
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.968158 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" event={"ID":"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b","Type":"ContainerStarted","Data":"50cc9b1123a57004b9fdc5913f93058cd08154562aa4ff9246ecef7ce3aea5ed"}
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.968195 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" event={"ID":"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b","Type":"ContainerStarted","Data":"cc2d468636c47b556f7dd10089a884822439b8b70929106804ddc0df12a8e8bc"}
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.969273 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7"
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.985423 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" podStartSLOduration=173.985359564 podStartE2EDuration="2m53.985359564s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:54.984125568 +0000 UTC m=+212.105991255" watchObservedRunningTime="2026-03-10 15:51:54.985359564 +0000 UTC m=+212.107225251"
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.989169 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2pgr7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body=
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.989263 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" podUID="4fd422ad-ee9e-46ec-ae29-a9cec3a7129b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused"
Mar 10 15:51:54 crc kubenswrapper[4749]: I0310 15:51:54.994873 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f" event={"ID":"31a8586b-f1b9-4b1a-b406-6d88768b4cf5","Type":"ContainerStarted","Data":"b5f13c1ce5cb8c53c0b1ae925d04246cf5ed5ad15b4b0f2c88cd5b7df372d72c"}
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.008860 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-96msp" event={"ID":"9a0fb55b-db13-4353-b4cc-253386c29267","Type":"ContainerStarted","Data":"54e92e8b9a34d9154142917389004ac04419f350e838cd950286b8b1e47ebfa9"}
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.011648 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" podStartSLOduration=174.011623377 podStartE2EDuration="2m54.011623377s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:55.010055923 +0000 UTC m=+212.131921610" watchObservedRunningTime="2026-03-10 15:51:55.011623377 +0000 UTC m=+212.133489064"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.012894 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.013287 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:55.513242344 +0000 UTC m=+212.635108031 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.013560 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.013738 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-config\") on node \"crc\" DevicePath \"\""
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.013752 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.013762 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrq7t\" (UniqueName: \"kubernetes.io/projected/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-kube-api-access-qrq7t\") on node \"crc\" DevicePath \"\""
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.013771 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.013801 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a0a1b4c-1c4c-4834-8321-ac4b86673e99-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.014918 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:55.514904511 +0000 UTC m=+212.636770198 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.032929 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gqv9f" podStartSLOduration=174.032905368 podStartE2EDuration="2m54.032905368s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:55.032641581 +0000 UTC m=+212.154507278" watchObservedRunningTime="2026-03-10 15:51:55.032905368 +0000 UTC m=+212.154771055"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.043543 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" event={"ID":"9098e467-df40-4bb8-bd7c-639d6e59ca82","Type":"ContainerStarted","Data":"ceee0a4478be28037480849c4c24a9890b7ddd1575e5eb8875e01d09459ebd89"}
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.072765 4749 generic.go:334] "Generic (PLEG): container finished" podID="2a0a1b4c-1c4c-4834-8321-ac4b86673e99" containerID="d57237169158c6dfd6355f0164ee6447060945c3da0f1a05e8f9de04eca82932" exitCode=0
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.074158 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" event={"ID":"2a0a1b4c-1c4c-4834-8321-ac4b86673e99","Type":"ContainerDied","Data":"d57237169158c6dfd6355f0164ee6447060945c3da0f1a05e8f9de04eca82932"}
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.074641 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg" event={"ID":"2a0a1b4c-1c4c-4834-8321-ac4b86673e99","Type":"ContainerDied","Data":"faf1e496257d6dd838f9222b307d5f14e86d206cf72d8b5113f0d9d29e6329a7"}
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.074760 4749 scope.go:117] "RemoveContainer" containerID="d57237169158c6dfd6355f0164ee6447060945c3da0f1a05e8f9de04eca82932"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.075073 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vlrtg"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.080926 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gvtf7" event={"ID":"404512a7-dc95-419b-a631-2384dd109476","Type":"ContainerStarted","Data":"4a79b3bf01d50fbc4fcd1aa10159ec3620ca54b7937c6e025635a283da93defa"}
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.115706 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6447697678-j6jqc"]
Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.116579 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1cf171a-a434-42b9-a974-ee6627c12968" containerName="route-controller-manager"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.116703 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1cf171a-a434-42b9-a974-ee6627c12968" containerName="route-controller-manager"
Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.116832 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0a1b4c-1c4c-4834-8321-ac4b86673e99" containerName="controller-manager"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.117037 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0a1b4c-1c4c-4834-8321-ac4b86673e99" containerName="controller-manager"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.117462 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1cf171a-a434-42b9-a974-ee6627c12968" containerName="route-controller-manager"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.117581 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a0a1b4c-1c4c-4834-8321-ac4b86673e99" containerName="controller-manager"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.118497 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.118643 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"]
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.119513 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"
Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.121044 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:55.621018126 +0000 UTC m=+212.742883813 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.121677 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.121710 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"]
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.125742 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.126708 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.127057 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.127440 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.127627 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.127821 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.130308 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6447697678-j6jqc"]
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.132798 4749 generic.go:334] "Generic (PLEG): container finished" podID="b1cf171a-a434-42b9-a974-ee6627c12968" containerID="0d9cb6fcff0ba813d0dce308a876dc48aa7fec622add3fda754eb81f7437271a" exitCode=0
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.133097 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.134581 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" event={"ID":"b1cf171a-a434-42b9-a974-ee6627c12968","Type":"ContainerDied","Data":"0d9cb6fcff0ba813d0dce308a876dc48aa7fec622add3fda754eb81f7437271a"}
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.134638 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7" event={"ID":"b1cf171a-a434-42b9-a974-ee6627c12968","Type":"ContainerDied","Data":"1e53474cc1d573a0caae08534745a6b01f15aed96daf4be0a36b24607a39cada"}
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.159341 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.178237 4749 scope.go:117] "RemoveContainer" containerID="d57237169158c6dfd6355f0164ee6447060945c3da0f1a05e8f9de04eca82932"
Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.186025 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d57237169158c6dfd6355f0164ee6447060945c3da0f1a05e8f9de04eca82932\": container with ID starting with d57237169158c6dfd6355f0164ee6447060945c3da0f1a05e8f9de04eca82932 not found: ID does not exist" containerID="d57237169158c6dfd6355f0164ee6447060945c3da0f1a05e8f9de04eca82932"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.186085 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d57237169158c6dfd6355f0164ee6447060945c3da0f1a05e8f9de04eca82932"} err="failed to get container status \"d57237169158c6dfd6355f0164ee6447060945c3da0f1a05e8f9de04eca82932\": rpc error: code = NotFound desc = could not find container \"d57237169158c6dfd6355f0164ee6447060945c3da0f1a05e8f9de04eca82932\": container with ID starting with d57237169158c6dfd6355f0164ee6447060945c3da0f1a05e8f9de04eca82932 not found: ID does not exist"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.186119 4749 scope.go:117] "RemoveContainer" containerID="0d9cb6fcff0ba813d0dce308a876dc48aa7fec622add3fda754eb81f7437271a"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.193039 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq" event={"ID":"04b00aff-5cc3-4d6f-947f-1f73bc45ad32","Type":"ContainerStarted","Data":"11a3671eb22ecba8e7221bf291be403e416ef5055d4d8a42c4ae71cb1577c4f8"}
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.193098 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq" event={"ID":"04b00aff-5cc3-4d6f-947f-1f73bc45ad32","Type":"ContainerStarted","Data":"7bfecaf04b12f24aec10f2e75d792cb94ae50a742d6248a6dde433b458237188"}
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.278609 4749 scope.go:117] "RemoveContainer" containerID="0d9cb6fcff0ba813d0dce308a876dc48aa7fec622add3fda754eb81f7437271a"
Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.297010 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d9cb6fcff0ba813d0dce308a876dc48aa7fec622add3fda754eb81f7437271a\": container with ID starting with 0d9cb6fcff0ba813d0dce308a876dc48aa7fec622add3fda754eb81f7437271a not found: ID does not exist" containerID="0d9cb6fcff0ba813d0dce308a876dc48aa7fec622add3fda754eb81f7437271a"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.297109 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d9cb6fcff0ba813d0dce308a876dc48aa7fec622add3fda754eb81f7437271a"} err="failed to get container status \"0d9cb6fcff0ba813d0dce308a876dc48aa7fec622add3fda754eb81f7437271a\": rpc error: code = NotFound desc = could not find container \"0d9cb6fcff0ba813d0dce308a876dc48aa7fec622add3fda754eb81f7437271a\": container with ID starting with 0d9cb6fcff0ba813d0dce308a876dc48aa7fec622add3fda754eb81f7437271a not found: ID does not exist"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.304956 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vlrtg"]
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.343397 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-client-ca\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.343489 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-serving-cert\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.343520 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xngkq\" (UniqueName: \"kubernetes.io/projected/b4cb9fcd-a072-4cfd-8051-c51e012326ce-kube-api-access-xngkq\") pod \"route-controller-manager-d678b9987-d4p2t\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.343630 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.343706 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4cb9fcd-a072-4cfd-8051-c51e012326ce-serving-cert\") pod \"route-controller-manager-d678b9987-d4p2t\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.343766 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-config\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.350931 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:55.850902793 +0000 UTC m=+212.972768490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.343820 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4cb9fcd-a072-4cfd-8051-c51e012326ce-config\") pod \"route-controller-manager-d678b9987-d4p2t\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.351278 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-proxy-ca-bundles\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.351385 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4cb9fcd-a072-4cfd-8051-c51e012326ce-client-ca\") pod \"route-controller-manager-d678b9987-d4p2t\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.352898 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgmw8\" (UniqueName: \"kubernetes.io/projected/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-kube-api-access-wgmw8\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.356101 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vlrtg"]
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.378461 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq" podStartSLOduration=174.378433873 podStartE2EDuration="2m54.378433873s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:55.344200991 +0000 UTC m=+212.466066688" watchObservedRunningTime="2026-03-10 15:51:55.378433873 +0000 UTC m=+212.500299560"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.395847 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7"]
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.399480 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6qfk7"]
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.416813 4749 ???:1] "http: TLS handshake error from 192.168.126.11:33784: no serving certificate available for the kubelet"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.454048 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.454248 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:55.954213268 +0000 UTC m=+213.076078955 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.454313 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-client-ca\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.454501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-serving-cert\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.454537 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xngkq\" (UniqueName: \"kubernetes.io/projected/b4cb9fcd-a072-4cfd-8051-c51e012326ce-kube-api-access-xngkq\") pod \"route-controller-manager-d678b9987-d4p2t\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.454581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.454634 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4cb9fcd-a072-4cfd-8051-c51e012326ce-serving-cert\") pod \"route-controller-manager-d678b9987-d4p2t\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.454658 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-config\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.454688 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4cb9fcd-a072-4cfd-8051-c51e012326ce-config\") pod \"route-controller-manager-d678b9987-d4p2t\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.454724 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-proxy-ca-bundles\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.454749 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4cb9fcd-a072-4cfd-8051-c51e012326ce-client-ca\") pod \"route-controller-manager-d678b9987-d4p2t\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.454777 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmw8\" (UniqueName: \"kubernetes.io/projected/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-kube-api-access-wgmw8\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.455166 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-client-ca\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.456733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-proxy-ca-bundles\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.457392 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-config\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.457926 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4cb9fcd-a072-4cfd-8051-c51e012326ce-config\") pod \"route-controller-manager-d678b9987-d4p2t\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.458128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4cb9fcd-a072-4cfd-8051-c51e012326ce-client-ca\") pod \"route-controller-manager-d678b9987-d4p2t\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"
Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.458492 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:55.958430579 +0000 UTC m=+213.080296356 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.504238 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-serving-cert\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.517854 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4cb9fcd-a072-4cfd-8051-c51e012326ce-serving-cert\") pod \"route-controller-manager-d678b9987-d4p2t\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.525163 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgmw8\" (UniqueName: \"kubernetes.io/projected/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-kube-api-access-wgmw8\") pod \"controller-manager-6447697678-j6jqc\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " pod="openshift-controller-manager/controller-manager-6447697678-j6jqc"
Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.531886 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xngkq\" (UniqueName: \"kubernetes.io/projected/b4cb9fcd-a072-4cfd-8051-c51e012326ce-kube-api-access-xngkq\") pod
\"route-controller-manager-d678b9987-d4p2t\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.564306 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.566210 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:56.066033577 +0000 UTC m=+213.187899474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.585848 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.632156 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a0a1b4c-1c4c-4834-8321-ac4b86673e99" path="/var/lib/kubelet/pods/2a0a1b4c-1c4c-4834-8321-ac4b86673e99/volumes" Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.635748 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1cf171a-a434-42b9-a974-ee6627c12968" path="/var/lib/kubelet/pods/b1cf171a-a434-42b9-a974-ee6627c12968/volumes" Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.658889 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:51:55 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:51:55 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:51:55 crc kubenswrapper[4749]: healthz check failed Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.658952 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.666437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.666852 4749 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:56.1668355 +0000 UTC m=+213.288701187 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.754501 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc" Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.769345 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.770078 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:56.270011901 +0000 UTC m=+213.391877598 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.770539 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.771092 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:56.271078151 +0000 UTC m=+213.392944028 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.874228 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.875351 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:56.375324422 +0000 UTC m=+213.497190109 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:55 crc kubenswrapper[4749]: I0310 15:51:55.980266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:55 crc kubenswrapper[4749]: E0310 15:51:55.981356 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:56.481335984 +0000 UTC m=+213.603201681 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.067117 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"] Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.087778 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:56 crc kubenswrapper[4749]: E0310 15:51:56.088287 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:56.588267273 +0000 UTC m=+213.710132950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:56 crc kubenswrapper[4749]: W0310 15:51:56.131680 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4cb9fcd_a072_4cfd_8051_c51e012326ce.slice/crio-1b1e1d4e76fa141dd9ded9c3c0965af6a599e9149d39fffea2a66d3d722b7c33 WatchSource:0}: Error finding container 1b1e1d4e76fa141dd9ded9c3c0965af6a599e9149d39fffea2a66d3d722b7c33: Status 404 returned error can't find the container with id 1b1e1d4e76fa141dd9ded9c3c0965af6a599e9149d39fffea2a66d3d722b7c33 Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.193485 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:56 crc kubenswrapper[4749]: E0310 15:51:56.193995 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:56.693964556 +0000 UTC m=+213.815830243 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.227726 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" event={"ID":"3b009296-7e7c-4e1b-bec2-24cf75849218","Type":"ContainerStarted","Data":"fca82aad26fe6fb51e6c3d005288e1c320c91f025525220def58af6067ac90f2"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.242298 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl" event={"ID":"971c0b09-9153-47f0-9bcb-0c4fb6496621","Type":"ContainerStarted","Data":"00ef7f0b39d121b2d866e684a8cdb3b8b4b6f165b3e225f895c78f9f1428f935"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.242394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl" event={"ID":"971c0b09-9153-47f0-9bcb-0c4fb6496621","Type":"ContainerStarted","Data":"d81ff1c4dba7bb5b1097a7276c75ac7f9d22643f90ff11d68a393ed2dbedcdd3"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.243243 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.249116 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" 
event={"ID":"60a5f517-7a57-4ff9-b1f7-c8a932a21649","Type":"ContainerStarted","Data":"0ea6471c5e4d4b8c66357bb98c885fdbbd5fb10ea09e3296ef73039ba325d65b"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.250351 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.254009 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" podStartSLOduration=175.253992329 podStartE2EDuration="2m55.253992329s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:56.252942179 +0000 UTC m=+213.374807896" watchObservedRunningTime="2026-03-10 15:51:56.253992329 +0000 UTC m=+213.375858016" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.264148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gvtf7" event={"ID":"404512a7-dc95-419b-a631-2384dd109476","Type":"ContainerStarted","Data":"4945646eaaf424e282d0f03a550aea11d4703d61ae3bf7f9bde2b5fcdd627096"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.269755 4749 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-5sv4c container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.269852 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" podUID="60a5f517-7a57-4ff9-b1f7-c8a932a21649" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial 
tcp 10.217.0.24:8443: connect: connection refused" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.278437 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-czdm4" event={"ID":"2e563226-525d-4a05-8d5e-ebf573a3d8fe","Type":"ContainerStarted","Data":"ccbf340ac729343138d594eb1ee96231bd56956b8dae2a141817a1bdd0dfc0c4"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.285112 4749 generic.go:334] "Generic (PLEG): container finished" podID="9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7" containerID="fa4b8710d4410f00e37ee91c90988f7c296a364a42a7c62d0b3c5a8cde602b61" exitCode=0 Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.285232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-g6chs" event={"ID":"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7","Type":"ContainerDied","Data":"fa4b8710d4410f00e37ee91c90988f7c296a364a42a7c62d0b3c5a8cde602b61"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.295004 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:56 crc kubenswrapper[4749]: E0310 15:51:56.298571 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:56.798540867 +0000 UTC m=+213.920406554 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.303273 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:56 crc kubenswrapper[4749]: E0310 15:51:56.306007 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:56.805991181 +0000 UTC m=+213.927856868 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.332308 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl" podStartSLOduration=175.332266736 podStartE2EDuration="2m55.332266736s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:56.328022223 +0000 UTC m=+213.449887910" watchObservedRunningTime="2026-03-10 15:51:56.332266736 +0000 UTC m=+213.454132443" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.336187 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wbdhx" event={"ID":"8f9330f0-4357-4ab3-bca8-245dcabbd614","Type":"ContainerStarted","Data":"303711424d34c8b3a4a0f061aa4a219b33f61a63d59e7fef7ef373595b0bca6c"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.340101 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" podStartSLOduration=175.340072109 podStartE2EDuration="2m55.340072109s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:56.299939637 +0000 UTC m=+213.421805344" watchObservedRunningTime="2026-03-10 15:51:56.340072109 +0000 UTC m=+213.461937796" Mar 10 15:51:56 crc 
kubenswrapper[4749]: I0310 15:51:56.345135 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" event={"ID":"9098e467-df40-4bb8-bd7c-639d6e59ca82","Type":"ContainerStarted","Data":"622f759ba1bf8686d1b8ba42363cef1f22fdcfbcba456e62150f3d6cde210da2"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.345821 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" event={"ID":"9098e467-df40-4bb8-bd7c-639d6e59ca82","Type":"ContainerStarted","Data":"75e5fc916bbc112f7666d0ccdd2f67d7262fdef23fdef7869ba376a8f587ac40"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.374366 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" event={"ID":"75e7b399-bd4e-44b1-8c75-f0d81588911d","Type":"ContainerStarted","Data":"97f2533fcd74a2260f355be2d5797d8343c672bd1c2de1597940db976c086b57"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.375756 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.387558 4749 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-5dvzk container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.42:6443/healthz\": dial tcp 10.217.0.42:6443: connect: connection refused" start-of-body= Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.387622 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" podUID="75e7b399-bd4e-44b1-8c75-f0d81588911d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.42:6443/healthz\": dial tcp 10.217.0.42:6443: connect: connection refused" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.406982 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:56 crc kubenswrapper[4749]: E0310 15:51:56.407306 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:56.907254927 +0000 UTC m=+214.029120614 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.407754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:56 crc kubenswrapper[4749]: E0310 15:51:56.410995 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:56.910963294 +0000 UTC m=+214.032828981 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.432338 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-czdm4" podStartSLOduration=175.432309156 podStartE2EDuration="2m55.432309156s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:56.430290909 +0000 UTC m=+213.552156596" watchObservedRunningTime="2026-03-10 15:51:56.432309156 +0000 UTC m=+213.554174843" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.449855 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dnk99" event={"ID":"a875d292-8a92-4500-87a4-84aa079141e5","Type":"ContainerStarted","Data":"5d27b33bc36aeb82a6c9d900d6819f9423343583f4fa82bdde6663f92e37c680"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.472877 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" event={"ID":"b4cb9fcd-a072-4cfd-8051-c51e012326ce","Type":"ContainerStarted","Data":"1b1e1d4e76fa141dd9ded9c3c0965af6a599e9149d39fffea2a66d3d722b7c33"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.486317 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" podStartSLOduration=175.486294315 
podStartE2EDuration="2m55.486294315s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:56.486229783 +0000 UTC m=+213.608095470" watchObservedRunningTime="2026-03-10 15:51:56.486294315 +0000 UTC m=+213.608160002" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.524410 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:56 crc kubenswrapper[4749]: E0310 15:51:56.525537 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:57.025456468 +0000 UTC m=+214.147322155 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.530511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" event={"ID":"758c2971-70dd-483e-befe-278dd8b2b042","Type":"ContainerStarted","Data":"b4eb86936b762c89e887bbf4beb5efe5e86a65165f8bf735c34345b34855c234"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.531012 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.534199 4749 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vdb7t container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" start-of-body= Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.534263 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" podUID="758c2971-70dd-483e-befe-278dd8b2b042" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.562634 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qhtxq" podStartSLOduration=175.562610325 
podStartE2EDuration="2m55.562610325s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:56.523204355 +0000 UTC m=+213.645070042" watchObservedRunningTime="2026-03-10 15:51:56.562610325 +0000 UTC m=+213.684476012" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.568483 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b2dsj" event={"ID":"9677d802-8bb2-4791-a2bf-b27de7a948b7","Type":"ContainerStarted","Data":"9dccc8dbb85c2c805d81dd0070e8ee972dfdeb3f930ea25d3d4a0d08d496a4ca"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.568538 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b2dsj" event={"ID":"9677d802-8bb2-4791-a2bf-b27de7a948b7","Type":"ContainerStarted","Data":"b57d856d9b472c5e4a4fc3f7a8320a34e96ae302472425f6d528ceac30425fa7"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.611186 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dnk99" podStartSLOduration=8.611145878 podStartE2EDuration="8.611145878s" podCreationTimestamp="2026-03-10 15:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:56.565869979 +0000 UTC m=+213.687735666" watchObservedRunningTime="2026-03-10 15:51:56.611145878 +0000 UTC m=+213.733011565" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.612030 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" podStartSLOduration=175.612022783 podStartE2EDuration="2m55.612022783s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:56.607394251 +0000 UTC m=+213.729259938" watchObservedRunningTime="2026-03-10 15:51:56.612022783 +0000 UTC m=+213.733888470" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.613540 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj" event={"ID":"1e4204ae-c210-41aa-8e8b-9c908c841143","Type":"ContainerStarted","Data":"4d3ac93ac035f0287abdbf39362ae6f8a59a0c3c3410c41b01e54286b5d73e28"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.629590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" event={"ID":"a1f72075-e615-4332-9439-1aa531ddfccc","Type":"ContainerStarted","Data":"abfece613e219ceb5a46211ed58819270f38fe6d0b7102603e7cf663ce3c5d7b"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.630434 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.633982 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:56 crc kubenswrapper[4749]: E0310 15:51:56.637076 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:57.137054311 +0000 UTC m=+214.258919998 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.666477 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q8p7p" event={"ID":"e9a7d78a-ab6f-456c-8433-5c1592d019c6","Type":"ContainerStarted","Data":"7b67f59f1754136de0dd49df52fde9cccd8544313dc59bc46d78588b772cd7df"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.688151 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:51:56 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:51:56 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:51:56 crc kubenswrapper[4749]: healthz check failed Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.688243 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.701654 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz" event={"ID":"25940433-fa76-4378-87b1-fb387be619ec","Type":"ContainerStarted","Data":"0697a3253b756b9132be407e117071eef9f01190db80cffd3b0541c9c5199816"} Mar 10 15:51:56 crc 
kubenswrapper[4749]: I0310 15:51:56.703984 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.735427 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b2dsj" podStartSLOduration=175.735360252 podStartE2EDuration="2m55.735360252s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:56.690221637 +0000 UTC m=+213.812087324" watchObservedRunningTime="2026-03-10 15:51:56.735360252 +0000 UTC m=+213.857225939" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.735478 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:56 crc kubenswrapper[4749]: E0310 15:51:56.736631 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:57.236608088 +0000 UTC m=+214.358473785 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.740828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gs6kq" event={"ID":"04b00aff-5cc3-4d6f-947f-1f73bc45ad32","Type":"ContainerStarted","Data":"1e3b433d0d4559e0e3667dce1e889ee075c189fc2ec253edf97b61cf85808e18"} Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.749533 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2pgr7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.749845 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" podUID="4fd422ad-ee9e-46ec-ae29-a9cec3a7129b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.808803 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-q8p7p" podStartSLOduration=175.80878291 podStartE2EDuration="2m55.80878291s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 
15:51:56.807525133 +0000 UTC m=+213.929390830" watchObservedRunningTime="2026-03-10 15:51:56.80878291 +0000 UTC m=+213.930648597" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.825809 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hgdgj" podStartSLOduration=175.825770717 podStartE2EDuration="2m55.825770717s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:56.736744252 +0000 UTC m=+213.858609959" watchObservedRunningTime="2026-03-10 15:51:56.825770717 +0000 UTC m=+213.947636404" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.831401 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6447697678-j6jqc"] Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.848893 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:56 crc kubenswrapper[4749]: E0310 15:51:56.851626 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:57.351607388 +0000 UTC m=+214.473473075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.868753 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4mlvz" podStartSLOduration=175.86872532 podStartE2EDuration="2m55.86872532s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:56.85308886 +0000 UTC m=+213.974954547" watchObservedRunningTime="2026-03-10 15:51:56.86872532 +0000 UTC m=+213.990591037" Mar 10 15:51:56 crc kubenswrapper[4749]: I0310 15:51:56.956465 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:56 crc kubenswrapper[4749]: E0310 15:51:56.956933 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:57.45691356 +0000 UTC m=+214.578779247 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.058399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:57 crc kubenswrapper[4749]: E0310 15:51:57.058742 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:57.558728352 +0000 UTC m=+214.680594039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.159719 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:57 crc kubenswrapper[4749]: E0310 15:51:57.160717 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:57.660697008 +0000 UTC m=+214.782562695 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.262655 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:57 crc kubenswrapper[4749]: E0310 15:51:57.263123 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:57.763105257 +0000 UTC m=+214.884970944 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.363760 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:57 crc kubenswrapper[4749]: E0310 15:51:57.364283 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:57.8642537 +0000 UTC m=+214.986119397 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.466183 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:57 crc kubenswrapper[4749]: E0310 15:51:57.467042 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:57.967019279 +0000 UTC m=+215.088884966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.568250 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:57 crc kubenswrapper[4749]: E0310 15:51:57.568831 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.0688055 +0000 UTC m=+215.190671187 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.568898 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:57 crc kubenswrapper[4749]: E0310 15:51:57.569480 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.069469568 +0000 UTC m=+215.191335255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.655645 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:51:57 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:51:57 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:51:57 crc kubenswrapper[4749]: healthz check failed Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.655736 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.670065 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:57 crc kubenswrapper[4749]: E0310 15:51:57.670549 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 15:51:58.170523798 +0000 UTC m=+215.292389485 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.755001 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-g6chs" event={"ID":"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7","Type":"ContainerStarted","Data":"eb3f5d350e8691ddb9ad4f74b3d740b0a2aa61af8bd8ea5c7c8e34db7e2a8d2d"} Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.755055 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-g6chs" event={"ID":"9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7","Type":"ContainerStarted","Data":"e9dc4c6acc3b535da970cc34d89433eab0f4f852c199756d77484988e4a237a9"} Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.761044 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wbdhx" event={"ID":"8f9330f0-4357-4ab3-bca8-245dcabbd614","Type":"ContainerStarted","Data":"5345145bc8b33231b242ceb9220113b41af40f22caaafd74b48660a8dff01552"} Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.762089 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wbdhx" Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.771997 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gvtf7" 
event={"ID":"404512a7-dc95-419b-a631-2384dd109476","Type":"ContainerStarted","Data":"9bb6cc3f3fac4fb3e90ec55803095ab466dc94f3d223baba9313c5b4db3c1566"} Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.772123 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:57 crc kubenswrapper[4749]: E0310 15:51:57.772591 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.272569217 +0000 UTC m=+215.394434984 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.778312 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc" event={"ID":"21717e88-ce9f-4fb7-ab8e-82722f94ca2c","Type":"ContainerStarted","Data":"55925ab8d4c6c825de84ed0c61835ff6e1762a25696a9b30c6fcaabca8678680"} Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.778367 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc" 
event={"ID":"21717e88-ce9f-4fb7-ab8e-82722f94ca2c","Type":"ContainerStarted","Data":"4db642a7985fbb958eac77d090867201019074f00c56b7bbde7680473f773723"} Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.779540 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc" Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.788441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" event={"ID":"b4cb9fcd-a072-4cfd-8051-c51e012326ce","Type":"ContainerStarted","Data":"e9b86f0aee4556b8daef7b67daebe65c0e2a4057c904e6a2574087280223c702"} Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.788734 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc" Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.789077 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.802136 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" event={"ID":"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba","Type":"ContainerStarted","Data":"ee394d1e5688cbd2af1b4597d4663b7ac6cfaa38f98471aa86c5dd3cb55654e3"} Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.807148 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-49t26" podStartSLOduration=176.807110908 podStartE2EDuration="2m56.807110908s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:56.942128916 +0000 UTC m=+214.063994603" 
watchObservedRunningTime="2026-03-10 15:51:57.807110908 +0000 UTC m=+214.928976595" Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.810761 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.824742 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.839927 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5sv4c" Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.844711 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wbdhx" podStartSLOduration=10.844678076 podStartE2EDuration="10.844678076s" podCreationTimestamp="2026-03-10 15:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:57.812788101 +0000 UTC m=+214.934653798" watchObservedRunningTime="2026-03-10 15:51:57.844678076 +0000 UTC m=+214.966543763" Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.846341 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-gvtf7" podStartSLOduration=176.846333964 podStartE2EDuration="2m56.846333964s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:57.842797412 +0000 UTC m=+214.964663099" watchObservedRunningTime="2026-03-10 15:51:57.846333964 +0000 UTC m=+214.968199651" Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.875276 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.889310 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" podStartSLOduration=4.889286916 podStartE2EDuration="4.889286916s" podCreationTimestamp="2026-03-10 15:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:57.887011501 +0000 UTC m=+215.008877198" watchObservedRunningTime="2026-03-10 15:51:57.889286916 +0000 UTC m=+215.011152603" Mar 10 15:51:57 crc kubenswrapper[4749]: E0310 15:51:57.908292 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.40824728 +0000 UTC m=+215.530112967 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.936769 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc" podStartSLOduration=4.936746428 podStartE2EDuration="4.936746428s" podCreationTimestamp="2026-03-10 15:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:57.933989379 +0000 UTC m=+215.055855066" watchObservedRunningTime="2026-03-10 15:51:57.936746428 +0000 UTC m=+215.058612115" Mar 10 15:51:57 crc kubenswrapper[4749]: I0310 15:51:57.983938 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:57 crc kubenswrapper[4749]: E0310 15:51:57.984295 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.484283562 +0000 UTC m=+215.606149239 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.085155 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:58 crc kubenswrapper[4749]: E0310 15:51:58.085655 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.5856363 +0000 UTC m=+215.707501987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.188895 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.189884 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:58 crc kubenswrapper[4749]: E0310 15:51:58.190320 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.690303645 +0000 UTC m=+215.812169332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.291587 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:58 crc kubenswrapper[4749]: E0310 15:51:58.292021 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.792000073 +0000 UTC m=+215.913865770 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.295731 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-9jhsg container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.295791 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-9jhsg" podUID="9bf60777-23f7-4d99-a70e-a0f4733c54b1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.295825 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-9jhsg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.295848 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9jhsg" podUID="9bf60777-23f7-4d99-a70e-a0f4733c54b1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.309612 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vdb7t" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.393487 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:58 crc kubenswrapper[4749]: E0310 15:51:58.393848 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.893834315 +0000 UTC m=+216.015699992 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.494857 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:58 crc kubenswrapper[4749]: E0310 15:51:58.495251 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:58.995232525 +0000 UTC m=+216.117098212 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.499342 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h9zb9"] Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.500339 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.504547 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.533749 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9zb9"] Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.596445 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea7768f-4827-4630-9169-8b44719ad779-utilities\") pod \"certified-operators-h9zb9\" (UID: \"fea7768f-4827-4630-9169-8b44719ad779\") " pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.596529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.596683 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.596839 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea7768f-4827-4630-9169-8b44719ad779-catalog-content\") pod \"certified-operators-h9zb9\" (UID: \"fea7768f-4827-4630-9169-8b44719ad779\") " pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.596872 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr4d2\" (UniqueName: \"kubernetes.io/projected/fea7768f-4827-4630-9169-8b44719ad779-kube-api-access-gr4d2\") pod \"certified-operators-h9zb9\" (UID: \"fea7768f-4827-4630-9169-8b44719ad779\") " pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.597044 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:58 crc kubenswrapper[4749]: E0310 15:51:58.597557 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:59.097541481 +0000 UTC m=+216.219407168 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.606278 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.621430 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.654576 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:51:58 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:51:58 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:51:58 crc kubenswrapper[4749]: healthz check failed Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.654676 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.666353 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-btwnr"] Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.667820 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.674419 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.691409 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-btwnr"] Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.698421 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.698708 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.698739 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea7768f-4827-4630-9169-8b44719ad779-catalog-content\") pod \"certified-operators-h9zb9\" (UID: \"fea7768f-4827-4630-9169-8b44719ad779\") " pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.698761 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr4d2\" (UniqueName: \"kubernetes.io/projected/fea7768f-4827-4630-9169-8b44719ad779-kube-api-access-gr4d2\") pod \"certified-operators-h9zb9\" (UID: \"fea7768f-4827-4630-9169-8b44719ad779\") " pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.698811 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.698843 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea7768f-4827-4630-9169-8b44719ad779-utilities\") pod \"certified-operators-h9zb9\" (UID: \"fea7768f-4827-4630-9169-8b44719ad779\") " pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.698887 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs\") pod \"network-metrics-daemon-jpmqp\" (UID: \"cd3985af-f2c3-4f91-919e-2ea9420418b3\") " pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.699689 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea7768f-4827-4630-9169-8b44719ad779-catalog-content\") pod \"certified-operators-h9zb9\" (UID: \"fea7768f-4827-4630-9169-8b44719ad779\") " pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:51:58 crc kubenswrapper[4749]: E0310 15:51:58.699911 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:59.199840937 +0000 UTC m=+216.321706814 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.702041 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea7768f-4827-4630-9169-8b44719ad779-utilities\") pod \"certified-operators-h9zb9\" (UID: \"fea7768f-4827-4630-9169-8b44719ad779\") " pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.704784 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd3985af-f2c3-4f91-919e-2ea9420418b3-metrics-certs\") pod \"network-metrics-daemon-jpmqp\" (UID: \"cd3985af-f2c3-4f91-919e-2ea9420418b3\") " pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.718600 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.720369 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.723693 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.737990 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.748213 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr4d2\" (UniqueName: \"kubernetes.io/projected/fea7768f-4827-4630-9169-8b44719ad779-kube-api-access-gr4d2\") pod \"certified-operators-h9zb9\" (UID: \"fea7768f-4827-4630-9169-8b44719ad779\") " pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.754179 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpmqp" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.761997 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.801822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19adf76-af03-4d7f-8661-4d93c67fda2e-utilities\") pod \"community-operators-btwnr\" (UID: \"a19adf76-af03-4d7f-8661-4d93c67fda2e\") " pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.801880 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h88qz\" (UniqueName: \"kubernetes.io/projected/a19adf76-af03-4d7f-8661-4d93c67fda2e-kube-api-access-h88qz\") pod \"community-operators-btwnr\" (UID: \"a19adf76-af03-4d7f-8661-4d93c67fda2e\") " pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.801933 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.802060 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19adf76-af03-4d7f-8661-4d93c67fda2e-catalog-content\") pod \"community-operators-btwnr\" (UID: \"a19adf76-af03-4d7f-8661-4d93c67fda2e\") " pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:51:58 crc kubenswrapper[4749]: E0310 15:51:58.802360 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:59.302343888 +0000 UTC m=+216.424209575 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.816017 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.838114 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-g6chs" podStartSLOduration=177.838093604 podStartE2EDuration="2m57.838093604s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:51:58.835680524 +0000 UTC m=+215.957546211" watchObservedRunningTime="2026-03-10 15:51:58.838093604 +0000 UTC m=+215.959959281" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.854297 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-99764"] Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.855371 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-99764" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.893846 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-99764"] Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.904986 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.905328 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h88qz\" (UniqueName: \"kubernetes.io/projected/a19adf76-af03-4d7f-8661-4d93c67fda2e-kube-api-access-h88qz\") pod \"community-operators-btwnr\" (UID: \"a19adf76-af03-4d7f-8661-4d93c67fda2e\") " pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:51:58 crc kubenswrapper[4749]: E0310 15:51:58.905574 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:59.405514078 +0000 UTC m=+216.527379765 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.905797 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.906410 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27649fa-b5c8-4aca-9de3-37f171af6e1c-utilities\") pod \"certified-operators-99764\" (UID: \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\") " pod="openshift-marketplace/certified-operators-99764" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.906554 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdhgk\" (UniqueName: \"kubernetes.io/projected/b27649fa-b5c8-4aca-9de3-37f171af6e1c-kube-api-access-jdhgk\") pod \"certified-operators-99764\" (UID: \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\") " pod="openshift-marketplace/certified-operators-99764" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.906604 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19adf76-af03-4d7f-8661-4d93c67fda2e-catalog-content\") pod 
\"community-operators-btwnr\" (UID: \"a19adf76-af03-4d7f-8661-4d93c67fda2e\") " pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.906878 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19adf76-af03-4d7f-8661-4d93c67fda2e-utilities\") pod \"community-operators-btwnr\" (UID: \"a19adf76-af03-4d7f-8661-4d93c67fda2e\") " pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.906941 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27649fa-b5c8-4aca-9de3-37f171af6e1c-catalog-content\") pod \"certified-operators-99764\" (UID: \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\") " pod="openshift-marketplace/certified-operators-99764" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.907343 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19adf76-af03-4d7f-8661-4d93c67fda2e-catalog-content\") pod \"community-operators-btwnr\" (UID: \"a19adf76-af03-4d7f-8661-4d93c67fda2e\") " pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.908741 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19adf76-af03-4d7f-8661-4d93c67fda2e-utilities\") pod \"community-operators-btwnr\" (UID: \"a19adf76-af03-4d7f-8661-4d93c67fda2e\") " pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:51:58 crc kubenswrapper[4749]: E0310 15:51:58.912340 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 15:51:59.412303424 +0000 UTC m=+216.534169281 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.971397 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h88qz\" (UniqueName: \"kubernetes.io/projected/a19adf76-af03-4d7f-8661-4d93c67fda2e-kube-api-access-h88qz\") pod \"community-operators-btwnr\" (UID: \"a19adf76-af03-4d7f-8661-4d93c67fda2e\") " pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:51:58 crc kubenswrapper[4749]: I0310 15:51:58.988078 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.014498 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.015101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27649fa-b5c8-4aca-9de3-37f171af6e1c-utilities\") pod \"certified-operators-99764\" (UID: \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\") " pod="openshift-marketplace/certified-operators-99764" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.015150 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdhgk\" (UniqueName: \"kubernetes.io/projected/b27649fa-b5c8-4aca-9de3-37f171af6e1c-kube-api-access-jdhgk\") pod \"certified-operators-99764\" (UID: \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\") " pod="openshift-marketplace/certified-operators-99764" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.015260 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27649fa-b5c8-4aca-9de3-37f171af6e1c-catalog-content\") pod \"certified-operators-99764\" (UID: \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\") " pod="openshift-marketplace/certified-operators-99764" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.015923 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27649fa-b5c8-4aca-9de3-37f171af6e1c-catalog-content\") pod \"certified-operators-99764\" (UID: \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\") 
" pod="openshift-marketplace/certified-operators-99764" Mar 10 15:51:59 crc kubenswrapper[4749]: E0310 15:51:59.016027 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:59.51600474 +0000 UTC m=+216.637870427 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.016289 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27649fa-b5c8-4aca-9de3-37f171af6e1c-utilities\") pod \"certified-operators-99764\" (UID: \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\") " pod="openshift-marketplace/certified-operators-99764" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.042091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdhgk\" (UniqueName: \"kubernetes.io/projected/b27649fa-b5c8-4aca-9de3-37f171af6e1c-kube-api-access-jdhgk\") pod \"certified-operators-99764\" (UID: \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\") " pod="openshift-marketplace/certified-operators-99764" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.058413 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2v6nf"] Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.059693 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.070139 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2v6nf"] Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.118568 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c56f09b3-981c-4a01-8ea3-4417c239cea6-utilities\") pod \"community-operators-2v6nf\" (UID: \"c56f09b3-981c-4a01-8ea3-4417c239cea6\") " pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.119926 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c56f09b3-981c-4a01-8ea3-4417c239cea6-catalog-content\") pod \"community-operators-2v6nf\" (UID: \"c56f09b3-981c-4a01-8ea3-4417c239cea6\") " pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.120084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.120123 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bb42\" (UniqueName: \"kubernetes.io/projected/c56f09b3-981c-4a01-8ea3-4417c239cea6-kube-api-access-6bb42\") pod \"community-operators-2v6nf\" (UID: \"c56f09b3-981c-4a01-8ea3-4417c239cea6\") " pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:51:59 crc kubenswrapper[4749]: E0310 
15:51:59.120905 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:59.620885419 +0000 UTC m=+216.742751106 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.168646 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.169594 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.179206 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.188751 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.189084 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.199497 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-99764" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.221672 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.221896 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcd06320-80a5-4a1c-a773-b27e88c39e0f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fcd06320-80a5-4a1c-a773-b27e88c39e0f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.221929 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bb42\" (UniqueName: \"kubernetes.io/projected/c56f09b3-981c-4a01-8ea3-4417c239cea6-kube-api-access-6bb42\") pod \"community-operators-2v6nf\" (UID: \"c56f09b3-981c-4a01-8ea3-4417c239cea6\") " pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.221950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcd06320-80a5-4a1c-a773-b27e88c39e0f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fcd06320-80a5-4a1c-a773-b27e88c39e0f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.222003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c56f09b3-981c-4a01-8ea3-4417c239cea6-utilities\") pod \"community-operators-2v6nf\" 
(UID: \"c56f09b3-981c-4a01-8ea3-4417c239cea6\") " pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.222026 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c56f09b3-981c-4a01-8ea3-4417c239cea6-catalog-content\") pod \"community-operators-2v6nf\" (UID: \"c56f09b3-981c-4a01-8ea3-4417c239cea6\") " pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.222568 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c56f09b3-981c-4a01-8ea3-4417c239cea6-catalog-content\") pod \"community-operators-2v6nf\" (UID: \"c56f09b3-981c-4a01-8ea3-4417c239cea6\") " pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:51:59 crc kubenswrapper[4749]: E0310 15:51:59.223081 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:59.723062121 +0000 UTC m=+216.844927808 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.223339 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c56f09b3-981c-4a01-8ea3-4417c239cea6-utilities\") pod \"community-operators-2v6nf\" (UID: \"c56f09b3-981c-4a01-8ea3-4417c239cea6\") " pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.252743 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bb42\" (UniqueName: \"kubernetes.io/projected/c56f09b3-981c-4a01-8ea3-4417c239cea6-kube-api-access-6bb42\") pod \"community-operators-2v6nf\" (UID: \"c56f09b3-981c-4a01-8ea3-4417c239cea6\") " pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.324203 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.324259 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcd06320-80a5-4a1c-a773-b27e88c39e0f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"fcd06320-80a5-4a1c-a773-b27e88c39e0f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.324298 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcd06320-80a5-4a1c-a773-b27e88c39e0f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fcd06320-80a5-4a1c-a773-b27e88c39e0f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.324491 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcd06320-80a5-4a1c-a773-b27e88c39e0f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fcd06320-80a5-4a1c-a773-b27e88c39e0f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:51:59 crc kubenswrapper[4749]: E0310 15:51:59.324822 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:51:59.824805371 +0000 UTC m=+216.946671058 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.357311 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcd06320-80a5-4a1c-a773-b27e88c39e0f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fcd06320-80a5-4a1c-a773-b27e88c39e0f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.400872 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.425429 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:59 crc kubenswrapper[4749]: E0310 15:51:59.425934 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:51:59.925915892 +0000 UTC m=+217.047781569 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.516481 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.529516 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:59 crc kubenswrapper[4749]: E0310 15:51:59.529979 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:52:00.029959438 +0000 UTC m=+217.151825125 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.633189 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:59 crc kubenswrapper[4749]: E0310 15:51:59.633618 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:52:00.133581032 +0000 UTC m=+217.255446719 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.633995 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:59 crc kubenswrapper[4749]: E0310 15:51:59.634360 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:52:00.134344993 +0000 UTC m=+217.256210680 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.684060 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:51:59 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:51:59 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:51:59 crc kubenswrapper[4749]: healthz check failed Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.684124 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.748073 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:59 crc kubenswrapper[4749]: E0310 15:51:59.748902 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 15:52:00.248855729 +0000 UTC m=+217.370721416 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.766025 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.766107 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.801982 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.851769 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.854258 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-btwnr"] Mar 10 15:51:59 crc kubenswrapper[4749]: E0310 15:51:59.854603 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-10 15:52:00.354589904 +0000 UTC m=+217.476455691 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.856693 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9zb9"] Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.861508 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jpmqp"] Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.865508 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9a43fb8eeb9e09c2af1e393ed896203592b68698cfa950d517e4f834cd2319cd"} Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.889455 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c5430c3db4c1a12c08d2e11cca249728923677b610c8a07eee2576d5ee476965"} Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.889553 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0a6ef650b3ffa3c8d3a86d5a5c43bb4d41e8c490ea4c1cc531119c5370dcd31b"} Mar 10 15:51:59 crc 
kubenswrapper[4749]: I0310 15:51:59.899414 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d6fe1b180d8e11c5f3c97e3164f41a1012fad19766ca870ed5fc3ee19493fb68"} Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.906746 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-99764"] Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.943021 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zm66b" Mar 10 15:51:59 crc kubenswrapper[4749]: I0310 15:51:59.953232 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:51:59 crc kubenswrapper[4749]: E0310 15:51:59.966277 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:52:00.466240098 +0000 UTC m=+217.588105785 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.058226 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:00 crc kubenswrapper[4749]: E0310 15:52:00.058621 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:52:00.558607609 +0000 UTC m=+217.680473296 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.087128 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.147613 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552632-xx7p8"] Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.148938 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552632-xx7p8" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.157060 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.158990 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.159835 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6bfw\" (UniqueName: \"kubernetes.io/projected/dec11cc4-b4eb-4b0b-b803-832ac4051974-kube-api-access-z6bfw\") pod \"auto-csr-approver-29552632-xx7p8\" (UID: \"dec11cc4-b4eb-4b0b-b803-832ac4051974\") " 
pod="openshift-infra/auto-csr-approver-29552632-xx7p8" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.160130 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552632-xx7p8"] Mar 10 15:52:00 crc kubenswrapper[4749]: E0310 15:52:00.160343 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:52:00.660315657 +0000 UTC m=+217.782181344 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.191860 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2v6nf"] Mar 10 15:52:00 crc kubenswrapper[4749]: W0310 15:52:00.229937 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc56f09b3_981c_4a01_8ea3_4417c239cea6.slice/crio-307f6797704577cad9e06040019163489a95e22e9f2fcb9f6aac772bf839f75c WatchSource:0}: Error finding container 307f6797704577cad9e06040019163489a95e22e9f2fcb9f6aac772bf839f75c: Status 404 returned error can't find the container with id 307f6797704577cad9e06040019163489a95e22e9f2fcb9f6aac772bf839f75c Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.261077 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.261144 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6bfw\" (UniqueName: \"kubernetes.io/projected/dec11cc4-b4eb-4b0b-b803-832ac4051974-kube-api-access-z6bfw\") pod \"auto-csr-approver-29552632-xx7p8\" (UID: \"dec11cc4-b4eb-4b0b-b803-832ac4051974\") " pod="openshift-infra/auto-csr-approver-29552632-xx7p8" Mar 10 15:52:00 crc kubenswrapper[4749]: E0310 15:52:00.261795 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:52:00.761756258 +0000 UTC m=+217.883621945 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.290503 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6bfw\" (UniqueName: \"kubernetes.io/projected/dec11cc4-b4eb-4b0b-b803-832ac4051974-kube-api-access-z6bfw\") pod \"auto-csr-approver-29552632-xx7p8\" (UID: \"dec11cc4-b4eb-4b0b-b803-832ac4051974\") " pod="openshift-infra/auto-csr-approver-29552632-xx7p8" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.362725 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:52:00 crc kubenswrapper[4749]: E0310 15:52:00.363134 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:52:00.863113616 +0000 UTC m=+217.984979313 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.463734 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:00 crc kubenswrapper[4749]: E0310 15:52:00.464126 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:52:00.964109254 +0000 UTC m=+218.085974951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.469804 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rg2tg"] Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.471270 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.475804 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rg2tg"] Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.476842 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.513006 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552632-xx7p8" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.564614 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:52:00 crc kubenswrapper[4749]: E0310 15:52:00.564901 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:52:01.064855765 +0000 UTC m=+218.186721452 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.569308 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2qrh\" (UniqueName: \"kubernetes.io/projected/c08511ac-9832-428c-be08-de0771ee5254-kube-api-access-p2qrh\") pod \"redhat-marketplace-rg2tg\" (UID: \"c08511ac-9832-428c-be08-de0771ee5254\") " pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.569630 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c08511ac-9832-428c-be08-de0771ee5254-utilities\") pod \"redhat-marketplace-rg2tg\" (UID: \"c08511ac-9832-428c-be08-de0771ee5254\") " pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.569946 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.570173 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c08511ac-9832-428c-be08-de0771ee5254-catalog-content\") pod \"redhat-marketplace-rg2tg\" (UID: \"c08511ac-9832-428c-be08-de0771ee5254\") " pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:52:00 crc kubenswrapper[4749]: E0310 15:52:00.570550 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:52:01.070523898 +0000 UTC m=+218.192389775 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.593966 4749 ???:1] "http: TLS handshake error from 192.168.126.11:33790: no serving certificate available for the kubelet" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.650613 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.654134 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:52:00 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:52:00 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:52:00 crc kubenswrapper[4749]: healthz check failed Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.654198 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.671794 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.672159 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c08511ac-9832-428c-be08-de0771ee5254-catalog-content\") pod \"redhat-marketplace-rg2tg\" (UID: \"c08511ac-9832-428c-be08-de0771ee5254\") " pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:52:00 crc kubenswrapper[4749]: E0310 15:52:00.672241 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:52:01.172203247 +0000 UTC m=+218.294068944 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.672291 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2qrh\" (UniqueName: \"kubernetes.io/projected/c08511ac-9832-428c-be08-de0771ee5254-kube-api-access-p2qrh\") pod \"redhat-marketplace-rg2tg\" (UID: \"c08511ac-9832-428c-be08-de0771ee5254\") " pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.672346 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c08511ac-9832-428c-be08-de0771ee5254-utilities\") pod \"redhat-marketplace-rg2tg\" (UID: 
\"c08511ac-9832-428c-be08-de0771ee5254\") " pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.672613 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c08511ac-9832-428c-be08-de0771ee5254-catalog-content\") pod \"redhat-marketplace-rg2tg\" (UID: \"c08511ac-9832-428c-be08-de0771ee5254\") " pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.673574 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c08511ac-9832-428c-be08-de0771ee5254-utilities\") pod \"redhat-marketplace-rg2tg\" (UID: \"c08511ac-9832-428c-be08-de0771ee5254\") " pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.708681 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.708762 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.715189 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2qrh\" (UniqueName: \"kubernetes.io/projected/c08511ac-9832-428c-be08-de0771ee5254-kube-api-access-p2qrh\") pod \"redhat-marketplace-rg2tg\" (UID: \"c08511ac-9832-428c-be08-de0771ee5254\") " pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.775856 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: 
\"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:00 crc kubenswrapper[4749]: E0310 15:52:00.776496 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:52:01.276478349 +0000 UTC m=+218.398344036 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.834924 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.856326 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dgb8q"] Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.859867 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.878575 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.878748 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqf9j\" (UniqueName: \"kubernetes.io/projected/ab8ea474-0133-461f-8498-033785eb0a52-kube-api-access-fqf9j\") pod \"redhat-marketplace-dgb8q\" (UID: \"ab8ea474-0133-461f-8498-033785eb0a52\") " pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.878777 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab8ea474-0133-461f-8498-033785eb0a52-catalog-content\") pod \"redhat-marketplace-dgb8q\" (UID: \"ab8ea474-0133-461f-8498-033785eb0a52\") " pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.878864 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab8ea474-0133-461f-8498-033785eb0a52-utilities\") pod \"redhat-marketplace-dgb8q\" (UID: \"ab8ea474-0133-461f-8498-033785eb0a52\") " pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:52:00 crc kubenswrapper[4749]: E0310 15:52:00.878985 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 15:52:01.37896728 +0000 UTC m=+218.500832967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.885957 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgb8q"] Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.916924 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btwnr" event={"ID":"a19adf76-af03-4d7f-8661-4d93c67fda2e","Type":"ContainerStarted","Data":"5f1fc77525173b361f9f0ee26bce57c6ae5d6483a8cc9b5a7d016c88abffdcc2"} Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.923264 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" event={"ID":"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba","Type":"ContainerStarted","Data":"804e089891c467947b9bbe3ebd2d103b592ac600ddbb67045871e10cf859818b"} Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.937208 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a550a0a2ea047c7b5c42fa2341d50aa3a4c9b7b241813cf6b9f344b4ae205112"} Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.942061 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552632-xx7p8"] Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.944983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-2v6nf" event={"ID":"c56f09b3-981c-4a01-8ea3-4417c239cea6","Type":"ContainerStarted","Data":"307f6797704577cad9e06040019163489a95e22e9f2fcb9f6aac772bf839f75c"} Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.948820 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d5d898cf1638e9cfcdcc77cf0e429cfc93af0560128885977234925ad455cbdb"} Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.951518 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" event={"ID":"cd3985af-f2c3-4f91-919e-2ea9420418b3","Type":"ContainerStarted","Data":"10d0b97c148fc5a8382248a34cb4e2b3b2d2d1d2732a6dc5aa0ad8c301850146"} Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.955962 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9zb9" event={"ID":"fea7768f-4827-4630-9169-8b44719ad779","Type":"ContainerStarted","Data":"2c92c4720be40da5d37886140c6df5544135f1414197d7ffc778057bb0be3db9"} Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.956022 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9zb9" event={"ID":"fea7768f-4827-4630-9169-8b44719ad779","Type":"ContainerStarted","Data":"30f71e3938620f6cc8389c6c836c11999dfc93b9480b095702fa466cec67664a"} Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.957908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fcd06320-80a5-4a1c-a773-b27e88c39e0f","Type":"ContainerStarted","Data":"5c6150499d63035aa1f556a4118105f861777c19f06643b697e521a8e7a28d21"} Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.960634 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="3b009296-7e7c-4e1b-bec2-24cf75849218" containerID="fca82aad26fe6fb51e6c3d005288e1c320c91f025525220def58af6067ac90f2" exitCode=0 Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.960708 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" event={"ID":"3b009296-7e7c-4e1b-bec2-24cf75849218","Type":"ContainerDied","Data":"fca82aad26fe6fb51e6c3d005288e1c320c91f025525220def58af6067ac90f2"} Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.969001 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-99764" event={"ID":"b27649fa-b5c8-4aca-9de3-37f171af6e1c","Type":"ContainerStarted","Data":"752582a86847c76962e6a06d3fbe063f9b1027144cbb97171c14f9bdac8b7de0"} Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.981178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqf9j\" (UniqueName: \"kubernetes.io/projected/ab8ea474-0133-461f-8498-033785eb0a52-kube-api-access-fqf9j\") pod \"redhat-marketplace-dgb8q\" (UID: \"ab8ea474-0133-461f-8498-033785eb0a52\") " pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.981229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab8ea474-0133-461f-8498-033785eb0a52-catalog-content\") pod \"redhat-marketplace-dgb8q\" (UID: \"ab8ea474-0133-461f-8498-033785eb0a52\") " pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.981348 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.981524 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab8ea474-0133-461f-8498-033785eb0a52-utilities\") pod \"redhat-marketplace-dgb8q\" (UID: \"ab8ea474-0133-461f-8498-033785eb0a52\") " pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.982190 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab8ea474-0133-461f-8498-033785eb0a52-utilities\") pod \"redhat-marketplace-dgb8q\" (UID: \"ab8ea474-0133-461f-8498-033785eb0a52\") " pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:52:00 crc kubenswrapper[4749]: E0310 15:52:00.983766 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:52:01.483743146 +0000 UTC m=+218.605609013 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:00 crc kubenswrapper[4749]: I0310 15:52:00.984354 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab8ea474-0133-461f-8498-033785eb0a52-catalog-content\") pod \"redhat-marketplace-dgb8q\" (UID: \"ab8ea474-0133-461f-8498-033785eb0a52\") " pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.009488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqf9j\" (UniqueName: \"kubernetes.io/projected/ab8ea474-0133-461f-8498-033785eb0a52-kube-api-access-fqf9j\") pod \"redhat-marketplace-dgb8q\" (UID: \"ab8ea474-0133-461f-8498-033785eb0a52\") " pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.082953 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:52:01 crc kubenswrapper[4749]: E0310 15:52:01.083208 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 15:52:01.583164039 +0000 UTC m=+218.705029726 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.083562 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:01 crc kubenswrapper[4749]: E0310 15:52:01.084105 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:52:01.584095836 +0000 UTC m=+218.705961523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.160590 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rg2tg"] Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.167951 4749 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.185460 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:52:01 crc kubenswrapper[4749]: E0310 15:52:01.185808 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:52:01.685788204 +0000 UTC m=+218.807653891 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:01 crc kubenswrapper[4749]: W0310 15:52:01.197771 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc08511ac_9832_428c_be08_de0771ee5254.slice/crio-63fb4fcd242ade22023453c0858d10aa357885be86291acb550ceeb7b15342ff WatchSource:0}: Error finding container 63fb4fcd242ade22023453c0858d10aa357885be86291acb550ceeb7b15342ff: Status 404 returned error can't find the container with id 63fb4fcd242ade22023453c0858d10aa357885be86291acb550ceeb7b15342ff Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.207188 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.245579 4749 patch_prober.go:28] interesting pod/apiserver-76f77b778f-g6chs container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 10 15:52:01 crc kubenswrapper[4749]: [+]log ok Mar 10 15:52:01 crc kubenswrapper[4749]: [+]etcd ok Mar 10 15:52:01 crc kubenswrapper[4749]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 10 15:52:01 crc kubenswrapper[4749]: [+]poststarthook/generic-apiserver-start-informers ok Mar 10 15:52:01 crc kubenswrapper[4749]: [+]poststarthook/max-in-flight-filter ok Mar 10 15:52:01 crc kubenswrapper[4749]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 10 15:52:01 crc kubenswrapper[4749]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 10 15:52:01 crc kubenswrapper[4749]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 10 15:52:01 crc kubenswrapper[4749]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 10 15:52:01 crc kubenswrapper[4749]: [+]poststarthook/project.openshift.io-projectcache ok Mar 10 15:52:01 crc kubenswrapper[4749]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 10 15:52:01 crc kubenswrapper[4749]: [+]poststarthook/openshift.io-startinformers ok Mar 10 15:52:01 crc kubenswrapper[4749]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 10 15:52:01 crc kubenswrapper[4749]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 10 15:52:01 crc kubenswrapper[4749]: livez check failed Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.245657 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-g6chs" podUID="9bca04d6-8ae5-4a03-b7bb-22a3ae6b5bd7" 
containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.286623 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:01 crc kubenswrapper[4749]: E0310 15:52:01.286972 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:52:01.786957987 +0000 UTC m=+218.908823674 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.307261 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.308122 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.311832 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.313223 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.351029 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.387940 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:52:01 crc kubenswrapper[4749]: E0310 15:52:01.395460 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:52:01.89542058 +0000 UTC m=+219.017286277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.396525 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:01 crc kubenswrapper[4749]: E0310 15:52:01.397066 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:52:01.897056387 +0000 UTC m=+219.018922074 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.447840 4749 ???:1] "http: TLS handshake error from 192.168.126.11:33794: no serving certificate available for the kubelet" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.469443 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.469523 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.474285 4749 patch_prober.go:28] interesting pod/console-f9d7485db-q8p7p container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.474349 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-q8p7p" podUID="e9a7d78a-ab6f-456c-8433-5c1592d019c6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.498368 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:52:01 crc kubenswrapper[4749]: E0310 15:52:01.499066 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:52:01.999013363 +0000 UTC m=+219.120879050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.499683 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.499720 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1ebb843-2c73-4908-abb8-4d7c35a8c0c8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b1ebb843-2c73-4908-abb8-4d7c35a8c0c8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.499784 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/b1ebb843-2c73-4908-abb8-4d7c35a8c0c8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b1ebb843-2c73-4908-abb8-4d7c35a8c0c8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:52:01 crc kubenswrapper[4749]: E0310 15:52:01.502174 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:52:02.002149253 +0000 UTC m=+219.124014940 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.575955 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgb8q"] Mar 10 15:52:01 crc kubenswrapper[4749]: W0310 15:52:01.584418 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab8ea474_0133_461f_8498_033785eb0a52.slice/crio-4206c37d121978a1471d5e590538fd97cd2b5119aad10997e31b4f46cf32e9bd WatchSource:0}: Error finding container 4206c37d121978a1471d5e590538fd97cd2b5119aad10997e31b4f46cf32e9bd: Status 404 returned error can't find the container with id 4206c37d121978a1471d5e590538fd97cd2b5119aad10997e31b4f46cf32e9bd Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.601577 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.601858 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1ebb843-2c73-4908-abb8-4d7c35a8c0c8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b1ebb843-2c73-4908-abb8-4d7c35a8c0c8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.601889 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1ebb843-2c73-4908-abb8-4d7c35a8c0c8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b1ebb843-2c73-4908-abb8-4d7c35a8c0c8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.602011 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1ebb843-2c73-4908-abb8-4d7c35a8c0c8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b1ebb843-2c73-4908-abb8-4d7c35a8c0c8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:52:01 crc kubenswrapper[4749]: E0310 15:52:01.602160 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:52:02.101943237 +0000 UTC m=+219.223808924 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.630487 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1ebb843-2c73-4908-abb8-4d7c35a8c0c8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b1ebb843-2c73-4908-abb8-4d7c35a8c0c8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.654520 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:52:01 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:52:01 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:52:01 crc kubenswrapper[4749]: healthz check failed Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.654938 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.703563 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:01 crc kubenswrapper[4749]: E0310 15:52:01.704194 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:52:02.20417726 +0000 UTC m=+219.326042947 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.746016 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.805535 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:52:01 crc kubenswrapper[4749]: E0310 15:52:01.805745 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:52:02.305707753 +0000 UTC m=+219.427573440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.805850 4749 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-10T15:52:01.167978143Z","Handler":null,"Name":""} Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.806662 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:01 crc kubenswrapper[4749]: E0310 15:52:01.807552 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:52:02.307541837 +0000 UTC m=+219.429407524 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m2m4f" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.840524 4749 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.840583 4749 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.879727 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9qnqn"] Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.881630 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.884710 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9qnqn"] Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.887827 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.910304 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.911242 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp9kj\" (UniqueName: \"kubernetes.io/projected/3484c369-2c9c-48d5-b7be-9dadf06d09ca-kube-api-access-vp9kj\") pod \"redhat-operators-9qnqn\" (UID: \"3484c369-2c9c-48d5-b7be-9dadf06d09ca\") " pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.911425 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3484c369-2c9c-48d5-b7be-9dadf06d09ca-utilities\") pod \"redhat-operators-9qnqn\" (UID: \"3484c369-2c9c-48d5-b7be-9dadf06d09ca\") " pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.911535 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3484c369-2c9c-48d5-b7be-9dadf06d09ca-catalog-content\") pod \"redhat-operators-9qnqn\" (UID: 
\"3484c369-2c9c-48d5-b7be-9dadf06d09ca\") " pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:52:01 crc kubenswrapper[4749]: I0310 15:52:01.990242 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.015495 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp9kj\" (UniqueName: \"kubernetes.io/projected/3484c369-2c9c-48d5-b7be-9dadf06d09ca-kube-api-access-vp9kj\") pod \"redhat-operators-9qnqn\" (UID: \"3484c369-2c9c-48d5-b7be-9dadf06d09ca\") " pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.015568 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.015616 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3484c369-2c9c-48d5-b7be-9dadf06d09ca-utilities\") pod \"redhat-operators-9qnqn\" (UID: \"3484c369-2c9c-48d5-b7be-9dadf06d09ca\") " pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.015657 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/3484c369-2c9c-48d5-b7be-9dadf06d09ca-catalog-content\") pod \"redhat-operators-9qnqn\" (UID: \"3484c369-2c9c-48d5-b7be-9dadf06d09ca\") " pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.016843 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3484c369-2c9c-48d5-b7be-9dadf06d09ca-catalog-content\") pod \"redhat-operators-9qnqn\" (UID: \"3484c369-2c9c-48d5-b7be-9dadf06d09ca\") " pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.017488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3484c369-2c9c-48d5-b7be-9dadf06d09ca-utilities\") pod \"redhat-operators-9qnqn\" (UID: \"3484c369-2c9c-48d5-b7be-9dadf06d09ca\") " pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.024014 4749 generic.go:334] "Generic (PLEG): container finished" podID="a19adf76-af03-4d7f-8661-4d93c67fda2e" containerID="c175286ddd09c72b26708698155d7481685999df6138d662ea4bc490018c3434" exitCode=0 Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.024276 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btwnr" event={"ID":"a19adf76-af03-4d7f-8661-4d93c67fda2e","Type":"ContainerDied","Data":"c175286ddd09c72b26708698155d7481685999df6138d662ea4bc490018c3434"} Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.035389 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" event={"ID":"cd3985af-f2c3-4f91-919e-2ea9420418b3","Type":"ContainerStarted","Data":"096348063e05c22aa55d63f28b947150ecbdf8eb896f0604e8db417aeb771fa6"} Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.040012 4749 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.040065 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.044443 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp9kj\" (UniqueName: \"kubernetes.io/projected/3484c369-2c9c-48d5-b7be-9dadf06d09ca-kube-api-access-vp9kj\") pod \"redhat-operators-9qnqn\" (UID: \"3484c369-2c9c-48d5-b7be-9dadf06d09ca\") " pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.072295 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" event={"ID":"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba","Type":"ContainerStarted","Data":"c1ac7d23896a6ca54b443b0ddc2b36ffc0b33d278ca39ac1dc505850456be89d"} Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.072351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" event={"ID":"2d82d6fd-bde2-49a0-b54c-f61a1adbd9ba","Type":"ContainerStarted","Data":"f38cef3b62609f2cc8221ea6c2d30c6bcbfc3fc86a515ae6881be825a25eea9e"} Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.082283 4749 generic.go:334] "Generic (PLEG): container finished" podID="fea7768f-4827-4630-9169-8b44719ad779" containerID="2c92c4720be40da5d37886140c6df5544135f1414197d7ffc778057bb0be3db9" 
exitCode=0 Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.082441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9zb9" event={"ID":"fea7768f-4827-4630-9169-8b44719ad779","Type":"ContainerDied","Data":"2c92c4720be40da5d37886140c6df5544135f1414197d7ffc778057bb0be3db9"} Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.099936 4749 generic.go:334] "Generic (PLEG): container finished" podID="fcd06320-80a5-4a1c-a773-b27e88c39e0f" containerID="6fd581ade1932da27d8d8d987647fce8a44e5a621d4169a3d940664f56dc86e2" exitCode=0 Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.100270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fcd06320-80a5-4a1c-a773-b27e88c39e0f","Type":"ContainerDied","Data":"6fd581ade1932da27d8d8d987647fce8a44e5a621d4169a3d940664f56dc86e2"} Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.101270 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m2m4f\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") " pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.110082 4749 generic.go:334] "Generic (PLEG): container finished" podID="c56f09b3-981c-4a01-8ea3-4417c239cea6" containerID="93ca7f9d523edfd4b9f72ca52820023e4f4ffc6a5b0d538524dd87189eb643c7" exitCode=0 Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.110240 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v6nf" event={"ID":"c56f09b3-981c-4a01-8ea3-4417c239cea6","Type":"ContainerDied","Data":"93ca7f9d523edfd4b9f72ca52820023e4f4ffc6a5b0d538524dd87189eb643c7"} Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 
15:52:02.110818 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-xlbcp" podStartSLOduration=14.110789208 podStartE2EDuration="14.110789208s" podCreationTimestamp="2026-03-10 15:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:52:02.098309581 +0000 UTC m=+219.220175268" watchObservedRunningTime="2026-03-10 15:52:02.110789208 +0000 UTC m=+219.232654895" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.120621 4749 generic.go:334] "Generic (PLEG): container finished" podID="c08511ac-9832-428c-be08-de0771ee5254" containerID="61411f915edeb23d34444958efb7feb36f65875f9b942f19a5f7da6c84a6799a" exitCode=0 Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.120744 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rg2tg" event={"ID":"c08511ac-9832-428c-be08-de0771ee5254","Type":"ContainerDied","Data":"61411f915edeb23d34444958efb7feb36f65875f9b942f19a5f7da6c84a6799a"} Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.120790 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rg2tg" event={"ID":"c08511ac-9832-428c-be08-de0771ee5254","Type":"ContainerStarted","Data":"63fb4fcd242ade22023453c0858d10aa357885be86291acb550ceeb7b15342ff"} Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.133517 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.144290 4749 generic.go:334] "Generic (PLEG): container finished" podID="ab8ea474-0133-461f-8498-033785eb0a52" containerID="25045fc1e098586fa59f0cc33c8dcc72045574dd47f57631ed3a547de9dfadc1" exitCode=0 Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.145152 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgb8q" event={"ID":"ab8ea474-0133-461f-8498-033785eb0a52","Type":"ContainerDied","Data":"25045fc1e098586fa59f0cc33c8dcc72045574dd47f57631ed3a547de9dfadc1"} Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.145605 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgb8q" event={"ID":"ab8ea474-0133-461f-8498-033785eb0a52","Type":"ContainerStarted","Data":"4206c37d121978a1471d5e590538fd97cd2b5119aad10997e31b4f46cf32e9bd"} Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.152241 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552632-xx7p8" event={"ID":"dec11cc4-b4eb-4b0b-b803-832ac4051974","Type":"ContainerStarted","Data":"39994db12380b04b2511404953c9312b14e44e45a4a12ead48e9c914d8199c39"} Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.167130 4749 generic.go:334] "Generic (PLEG): container finished" podID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" containerID="0fd1cc7dab6111c3999b0d6c0bf61799066e237bf84fef7310ae2043dece1183" exitCode=0 Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.169051 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-99764" event={"ID":"b27649fa-b5c8-4aca-9de3-37f171af6e1c","Type":"ContainerDied","Data":"0fd1cc7dab6111c3999b0d6c0bf61799066e237bf84fef7310ae2043dece1183"} Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.181768 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.266149 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n829z"] Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.281029 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n829z" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.289672 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.336480 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n829z"] Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.383310 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.437039 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5296034-50c7-42d3-b2f4-34f1d451be99-utilities\") pod \"redhat-operators-n829z\" (UID: \"f5296034-50c7-42d3-b2f4-34f1d451be99\") " pod="openshift-marketplace/redhat-operators-n829z" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.437318 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5296034-50c7-42d3-b2f4-34f1d451be99-catalog-content\") pod \"redhat-operators-n829z\" (UID: \"f5296034-50c7-42d3-b2f4-34f1d451be99\") " pod="openshift-marketplace/redhat-operators-n829z" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.437441 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zdj9\" (UniqueName: 
\"kubernetes.io/projected/f5296034-50c7-42d3-b2f4-34f1d451be99-kube-api-access-2zdj9\") pod \"redhat-operators-n829z\" (UID: \"f5296034-50c7-42d3-b2f4-34f1d451be99\") " pod="openshift-marketplace/redhat-operators-n829z" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.539232 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zdj9\" (UniqueName: \"kubernetes.io/projected/f5296034-50c7-42d3-b2f4-34f1d451be99-kube-api-access-2zdj9\") pod \"redhat-operators-n829z\" (UID: \"f5296034-50c7-42d3-b2f4-34f1d451be99\") " pod="openshift-marketplace/redhat-operators-n829z" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.539336 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5296034-50c7-42d3-b2f4-34f1d451be99-utilities\") pod \"redhat-operators-n829z\" (UID: \"f5296034-50c7-42d3-b2f4-34f1d451be99\") " pod="openshift-marketplace/redhat-operators-n829z" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.539363 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5296034-50c7-42d3-b2f4-34f1d451be99-catalog-content\") pod \"redhat-operators-n829z\" (UID: \"f5296034-50c7-42d3-b2f4-34f1d451be99\") " pod="openshift-marketplace/redhat-operators-n829z" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.540809 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5296034-50c7-42d3-b2f4-34f1d451be99-catalog-content\") pod \"redhat-operators-n829z\" (UID: \"f5296034-50c7-42d3-b2f4-34f1d451be99\") " pod="openshift-marketplace/redhat-operators-n829z" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.542092 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f5296034-50c7-42d3-b2f4-34f1d451be99-utilities\") pod \"redhat-operators-n829z\" (UID: \"f5296034-50c7-42d3-b2f4-34f1d451be99\") " pod="openshift-marketplace/redhat-operators-n829z" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.561057 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m2m4f"] Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.571663 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zdj9\" (UniqueName: \"kubernetes.io/projected/f5296034-50c7-42d3-b2f4-34f1d451be99-kube-api-access-2zdj9\") pod \"redhat-operators-n829z\" (UID: \"f5296034-50c7-42d3-b2f4-34f1d451be99\") " pod="openshift-marketplace/redhat-operators-n829z" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.575899 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" Mar 10 15:52:02 crc kubenswrapper[4749]: W0310 15:52:02.582240 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod085fc200_fd9e_4e5b_9aef_5a5488c5cb17.slice/crio-c996ecc19690f519946ad87fbc4b740aa743f344151b2405f6abe804584b7d03 WatchSource:0}: Error finding container c996ecc19690f519946ad87fbc4b740aa743f344151b2405f6abe804584b7d03: Status 404 returned error can't find the container with id c996ecc19690f519946ad87fbc4b740aa743f344151b2405f6abe804584b7d03 Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.607874 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n829z" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.652836 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9qnqn"] Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.654735 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:52:02 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:52:02 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:52:02 crc kubenswrapper[4749]: healthz check failed Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.654800 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:52:02 crc kubenswrapper[4749]: W0310 15:52:02.703230 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3484c369_2c9c_48d5_b7be_9dadf06d09ca.slice/crio-c6aa59165035d6d65502e04a99b34ef8a876c78b9402ecb64ab008921a80432a WatchSource:0}: Error finding container c6aa59165035d6d65502e04a99b34ef8a876c78b9402ecb64ab008921a80432a: Status 404 returned error can't find the container with id c6aa59165035d6d65502e04a99b34ef8a876c78b9402ecb64ab008921a80432a Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.744735 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngdx9\" (UniqueName: \"kubernetes.io/projected/3b009296-7e7c-4e1b-bec2-24cf75849218-kube-api-access-ngdx9\") pod \"3b009296-7e7c-4e1b-bec2-24cf75849218\" (UID: \"3b009296-7e7c-4e1b-bec2-24cf75849218\") " Mar 10 15:52:02 
crc kubenswrapper[4749]: I0310 15:52:02.744894 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b009296-7e7c-4e1b-bec2-24cf75849218-config-volume\") pod \"3b009296-7e7c-4e1b-bec2-24cf75849218\" (UID: \"3b009296-7e7c-4e1b-bec2-24cf75849218\") " Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.744942 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b009296-7e7c-4e1b-bec2-24cf75849218-secret-volume\") pod \"3b009296-7e7c-4e1b-bec2-24cf75849218\" (UID: \"3b009296-7e7c-4e1b-bec2-24cf75849218\") " Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.746172 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b009296-7e7c-4e1b-bec2-24cf75849218-config-volume" (OuterVolumeSpecName: "config-volume") pod "3b009296-7e7c-4e1b-bec2-24cf75849218" (UID: "3b009296-7e7c-4e1b-bec2-24cf75849218"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.746481 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b009296-7e7c-4e1b-bec2-24cf75849218-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.751199 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b009296-7e7c-4e1b-bec2-24cf75849218-kube-api-access-ngdx9" (OuterVolumeSpecName: "kube-api-access-ngdx9") pod "3b009296-7e7c-4e1b-bec2-24cf75849218" (UID: "3b009296-7e7c-4e1b-bec2-24cf75849218"). InnerVolumeSpecName "kube-api-access-ngdx9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.751422 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b009296-7e7c-4e1b-bec2-24cf75849218-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3b009296-7e7c-4e1b-bec2-24cf75849218" (UID: "3b009296-7e7c-4e1b-bec2-24cf75849218"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.855685 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngdx9\" (UniqueName: \"kubernetes.io/projected/3b009296-7e7c-4e1b-bec2-24cf75849218-kube-api-access-ngdx9\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:02 crc kubenswrapper[4749]: I0310 15:52:02.855727 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b009296-7e7c-4e1b-bec2-24cf75849218-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.072742 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n829z"] Mar 10 15:52:03 crc kubenswrapper[4749]: W0310 15:52:03.092937 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5296034_50c7_42d3_b2f4_34f1d451be99.slice/crio-2ee8ab63b794ec8a1af2d972801052c8369d089f8a084c8eae2d2d03ea21f94e WatchSource:0}: Error finding container 2ee8ab63b794ec8a1af2d972801052c8369d089f8a084c8eae2d2d03ea21f94e: Status 404 returned error can't find the container with id 2ee8ab63b794ec8a1af2d972801052c8369d089f8a084c8eae2d2d03ea21f94e Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.211477 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" 
event={"ID":"3b009296-7e7c-4e1b-bec2-24cf75849218","Type":"ContainerDied","Data":"37f57a0b30ea155d4e7ea5eaf0d9bbd232b9db78c37355ef1c23fc3e23ce953d"} Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.211564 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37f57a0b30ea155d4e7ea5eaf0d9bbd232b9db78c37355ef1c23fc3e23ce953d" Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.211685 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr" Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.236467 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n829z" event={"ID":"f5296034-50c7-42d3-b2f4-34f1d451be99","Type":"ContainerStarted","Data":"2ee8ab63b794ec8a1af2d972801052c8369d089f8a084c8eae2d2d03ea21f94e"} Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.250210 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpmqp" event={"ID":"cd3985af-f2c3-4f91-919e-2ea9420418b3","Type":"ContainerStarted","Data":"b3694150fd76f56e264aecc4e06a35b0d2d17b3e00de0e2751a535db6a8dda8d"} Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.257174 4749 generic.go:334] "Generic (PLEG): container finished" podID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" containerID="6ea8c2fd2f0f20d37c052a4d7005dfa911fc49f83e8c97bd4f665df6b2376216" exitCode=0 Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.257287 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qnqn" event={"ID":"3484c369-2c9c-48d5-b7be-9dadf06d09ca","Type":"ContainerDied","Data":"6ea8c2fd2f0f20d37c052a4d7005dfa911fc49f83e8c97bd4f665df6b2376216"} Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.257334 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qnqn" 
event={"ID":"3484c369-2c9c-48d5-b7be-9dadf06d09ca","Type":"ContainerStarted","Data":"c6aa59165035d6d65502e04a99b34ef8a876c78b9402ecb64ab008921a80432a"} Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.273286 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" event={"ID":"085fc200-fd9e-4e5b-9aef-5a5488c5cb17","Type":"ContainerStarted","Data":"a75d818ff88dcd43558378065a09dc9c2b1bf350cbacea662004daf262d0617c"} Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.273359 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" event={"ID":"085fc200-fd9e-4e5b-9aef-5a5488c5cb17","Type":"ContainerStarted","Data":"c996ecc19690f519946ad87fbc4b740aa743f344151b2405f6abe804584b7d03"} Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.274405 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.281156 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jpmqp" podStartSLOduration=182.281137802 podStartE2EDuration="3m2.281137802s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:52:03.279747223 +0000 UTC m=+220.401612910" watchObservedRunningTime="2026-03-10 15:52:03.281137802 +0000 UTC m=+220.403003489" Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.304510 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b1ebb843-2c73-4908-abb8-4d7c35a8c0c8","Type":"ContainerStarted","Data":"63e10d57c883a305604dc617f8b4dc9be84df95a0652dfab41dba15ab2a8ef17"} Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.304600 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b1ebb843-2c73-4908-abb8-4d7c35a8c0c8","Type":"ContainerStarted","Data":"afd2592c9aefbf5c6580a5e62ff165ca3595c566d3928b8abea7f8afc88ed2bb"} Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.384865 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" podStartSLOduration=182.384836578 podStartE2EDuration="3m2.384836578s" podCreationTimestamp="2026-03-10 15:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:52:03.37131852 +0000 UTC m=+220.493184207" watchObservedRunningTime="2026-03-10 15:52:03.384836578 +0000 UTC m=+220.506702265" Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.398424 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.398351026 podStartE2EDuration="2.398351026s" podCreationTimestamp="2026-03-10 15:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:52:03.395452183 +0000 UTC m=+220.517317890" watchObservedRunningTime="2026-03-10 15:52:03.398351026 +0000 UTC m=+220.520216723" Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.621341 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.655109 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:52:03 crc 
kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:52:03 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:52:03 crc kubenswrapper[4749]: healthz check failed Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.655212 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.790783 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.881933 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcd06320-80a5-4a1c-a773-b27e88c39e0f-kube-api-access\") pod \"fcd06320-80a5-4a1c-a773-b27e88c39e0f\" (UID: \"fcd06320-80a5-4a1c-a773-b27e88c39e0f\") " Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.882088 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcd06320-80a5-4a1c-a773-b27e88c39e0f-kubelet-dir\") pod \"fcd06320-80a5-4a1c-a773-b27e88c39e0f\" (UID: \"fcd06320-80a5-4a1c-a773-b27e88c39e0f\") " Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.882630 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcd06320-80a5-4a1c-a773-b27e88c39e0f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fcd06320-80a5-4a1c-a773-b27e88c39e0f" (UID: "fcd06320-80a5-4a1c-a773-b27e88c39e0f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.891953 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcd06320-80a5-4a1c-a773-b27e88c39e0f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fcd06320-80a5-4a1c-a773-b27e88c39e0f" (UID: "fcd06320-80a5-4a1c-a773-b27e88c39e0f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.984399 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcd06320-80a5-4a1c-a773-b27e88c39e0f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:03 crc kubenswrapper[4749]: I0310 15:52:03.984450 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fcd06320-80a5-4a1c-a773-b27e88c39e0f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:04 crc kubenswrapper[4749]: I0310 15:52:04.332342 4749 generic.go:334] "Generic (PLEG): container finished" podID="b1ebb843-2c73-4908-abb8-4d7c35a8c0c8" containerID="63e10d57c883a305604dc617f8b4dc9be84df95a0652dfab41dba15ab2a8ef17" exitCode=0 Mar 10 15:52:04 crc kubenswrapper[4749]: I0310 15:52:04.332460 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b1ebb843-2c73-4908-abb8-4d7c35a8c0c8","Type":"ContainerDied","Data":"63e10d57c883a305604dc617f8b4dc9be84df95a0652dfab41dba15ab2a8ef17"} Mar 10 15:52:04 crc kubenswrapper[4749]: I0310 15:52:04.352779 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fcd06320-80a5-4a1c-a773-b27e88c39e0f","Type":"ContainerDied","Data":"5c6150499d63035aa1f556a4118105f861777c19f06643b697e521a8e7a28d21"} Mar 10 15:52:04 crc kubenswrapper[4749]: I0310 
15:52:04.352854 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c6150499d63035aa1f556a4118105f861777c19f06643b697e521a8e7a28d21" Mar 10 15:52:04 crc kubenswrapper[4749]: I0310 15:52:04.352840 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:52:04 crc kubenswrapper[4749]: I0310 15:52:04.363070 4749 generic.go:334] "Generic (PLEG): container finished" podID="f5296034-50c7-42d3-b2f4-34f1d451be99" containerID="e79e714b546820829bf59539aedd0158696a451f1a0f0d779efe7b9d5dbe05c5" exitCode=0 Mar 10 15:52:04 crc kubenswrapper[4749]: I0310 15:52:04.364087 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n829z" event={"ID":"f5296034-50c7-42d3-b2f4-34f1d451be99","Type":"ContainerDied","Data":"e79e714b546820829bf59539aedd0158696a451f1a0f0d779efe7b9d5dbe05c5"} Mar 10 15:52:04 crc kubenswrapper[4749]: I0310 15:52:04.654104 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:52:04 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:52:04 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:52:04 crc kubenswrapper[4749]: healthz check failed Mar 10 15:52:04 crc kubenswrapper[4749]: I0310 15:52:04.654231 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:52:05 crc kubenswrapper[4749]: I0310 15:52:05.654708 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:52:05 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:52:05 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:52:05 crc kubenswrapper[4749]: healthz check failed Mar 10 15:52:05 crc kubenswrapper[4749]: I0310 15:52:05.655274 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:52:05 crc kubenswrapper[4749]: I0310 15:52:05.710559 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:52:05 crc kubenswrapper[4749]: I0310 15:52:05.715613 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-g6chs" Mar 10 15:52:06 crc kubenswrapper[4749]: I0310 15:52:06.246186 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wbdhx" Mar 10 15:52:06 crc kubenswrapper[4749]: I0310 15:52:06.658517 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:52:06 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:52:06 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:52:06 crc kubenswrapper[4749]: healthz check failed Mar 10 15:52:06 crc kubenswrapper[4749]: I0310 15:52:06.658606 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Mar 10 15:52:07 crc kubenswrapper[4749]: I0310 15:52:07.653015 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:52:07 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:52:07 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:52:07 crc kubenswrapper[4749]: healthz check failed Mar 10 15:52:07 crc kubenswrapper[4749]: I0310 15:52:07.653161 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:52:08 crc kubenswrapper[4749]: I0310 15:52:08.342252 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-9jhsg" Mar 10 15:52:08 crc kubenswrapper[4749]: I0310 15:52:08.652780 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:52:08 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:52:08 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:52:08 crc kubenswrapper[4749]: healthz check failed Mar 10 15:52:08 crc kubenswrapper[4749]: I0310 15:52:08.654607 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:52:09 crc kubenswrapper[4749]: I0310 15:52:09.652346 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:52:09 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:52:09 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:52:09 crc kubenswrapper[4749]: healthz check failed Mar 10 15:52:09 crc kubenswrapper[4749]: I0310 15:52:09.652434 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:52:10 crc kubenswrapper[4749]: I0310 15:52:10.651904 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:52:10 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:52:10 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:52:10 crc kubenswrapper[4749]: healthz check failed Mar 10 15:52:10 crc kubenswrapper[4749]: I0310 15:52:10.652062 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:52:10 crc kubenswrapper[4749]: I0310 15:52:10.871113 4749 ???:1] "http: TLS handshake error from 192.168.126.11:42544: no serving certificate available for the kubelet" Mar 10 15:52:11 crc kubenswrapper[4749]: I0310 15:52:11.469623 4749 patch_prober.go:28] interesting pod/console-f9d7485db-q8p7p container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: 
connection refused" start-of-body= Mar 10 15:52:11 crc kubenswrapper[4749]: I0310 15:52:11.469712 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-q8p7p" podUID="e9a7d78a-ab6f-456c-8433-5c1592d019c6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.29:8443/health\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 10 15:52:11 crc kubenswrapper[4749]: I0310 15:52:11.652432 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:52:11 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:52:11 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:52:11 crc kubenswrapper[4749]: healthz check failed Mar 10 15:52:11 crc kubenswrapper[4749]: I0310 15:52:11.652506 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:52:11 crc kubenswrapper[4749]: I0310 15:52:11.795293 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6447697678-j6jqc"] Mar 10 15:52:11 crc kubenswrapper[4749]: I0310 15:52:11.795703 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc" podUID="21717e88-ce9f-4fb7-ab8e-82722f94ca2c" containerName="controller-manager" containerID="cri-o://55925ab8d4c6c825de84ed0c61835ff6e1762a25696a9b30c6fcaabca8678680" gracePeriod=30 Mar 10 15:52:11 crc kubenswrapper[4749]: I0310 15:52:11.805559 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"] Mar 10 15:52:11 crc kubenswrapper[4749]: I0310 15:52:11.806459 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" podUID="b4cb9fcd-a072-4cfd-8051-c51e012326ce" containerName="route-controller-manager" containerID="cri-o://e9b86f0aee4556b8daef7b67daebe65c0e2a4057c904e6a2574087280223c702" gracePeriod=30 Mar 10 15:52:12 crc kubenswrapper[4749]: I0310 15:52:12.456924 4749 generic.go:334] "Generic (PLEG): container finished" podID="21717e88-ce9f-4fb7-ab8e-82722f94ca2c" containerID="55925ab8d4c6c825de84ed0c61835ff6e1762a25696a9b30c6fcaabca8678680" exitCode=0 Mar 10 15:52:12 crc kubenswrapper[4749]: I0310 15:52:12.457091 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc" event={"ID":"21717e88-ce9f-4fb7-ab8e-82722f94ca2c","Type":"ContainerDied","Data":"55925ab8d4c6c825de84ed0c61835ff6e1762a25696a9b30c6fcaabca8678680"} Mar 10 15:52:12 crc kubenswrapper[4749]: I0310 15:52:12.464419 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4cb9fcd-a072-4cfd-8051-c51e012326ce" containerID="e9b86f0aee4556b8daef7b67daebe65c0e2a4057c904e6a2574087280223c702" exitCode=0 Mar 10 15:52:12 crc kubenswrapper[4749]: I0310 15:52:12.464435 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" event={"ID":"b4cb9fcd-a072-4cfd-8051-c51e012326ce","Type":"ContainerDied","Data":"e9b86f0aee4556b8daef7b67daebe65c0e2a4057c904e6a2574087280223c702"} Mar 10 15:52:12 crc kubenswrapper[4749]: I0310 15:52:12.653718 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Mar 10 15:52:12 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:52:12 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:52:12 crc kubenswrapper[4749]: healthz check failed Mar 10 15:52:12 crc kubenswrapper[4749]: I0310 15:52:12.653823 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:52:13 crc kubenswrapper[4749]: I0310 15:52:13.655211 4749 patch_prober.go:28] interesting pod/router-default-5444994796-6g5bk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:52:13 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Mar 10 15:52:13 crc kubenswrapper[4749]: [+]process-running ok Mar 10 15:52:13 crc kubenswrapper[4749]: healthz check failed Mar 10 15:52:13 crc kubenswrapper[4749]: I0310 15:52:13.655585 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6g5bk" podUID="18f8edee-4182-4211-9036-f087d4d08f90" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:52:14 crc kubenswrapper[4749]: I0310 15:52:14.653394 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:52:14 crc kubenswrapper[4749]: I0310 15:52:14.656707 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6g5bk" Mar 10 15:52:15 crc kubenswrapper[4749]: I0310 15:52:15.587850 4749 patch_prober.go:28] interesting pod/route-controller-manager-d678b9987-d4p2t container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure 
output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body= Mar 10 15:52:15 crc kubenswrapper[4749]: I0310 15:52:15.587934 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" podUID="b4cb9fcd-a072-4cfd-8051-c51e012326ce" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" Mar 10 15:52:15 crc kubenswrapper[4749]: I0310 15:52:15.755841 4749 patch_prober.go:28] interesting pod/controller-manager-6447697678-j6jqc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 10 15:52:15 crc kubenswrapper[4749]: I0310 15:52:15.755929 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc" podUID="21717e88-ce9f-4fb7-ab8e-82722f94ca2c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 10 15:52:20 crc kubenswrapper[4749]: I0310 15:52:20.980613 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:52:20 crc kubenswrapper[4749]: I0310 15:52:20.981030 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:52:21 crc kubenswrapper[4749]: I0310 15:52:21.501924 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:52:21 crc kubenswrapper[4749]: I0310 15:52:21.508224 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 15:52:22 crc kubenswrapper[4749]: I0310 15:52:22.143701 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" Mar 10 15:52:25 crc kubenswrapper[4749]: I0310 15:52:25.587610 4749 patch_prober.go:28] interesting pod/route-controller-manager-d678b9987-d4p2t container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body= Mar 10 15:52:25 crc kubenswrapper[4749]: I0310 15:52:25.588192 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" podUID="b4cb9fcd-a072-4cfd-8051-c51e012326ce" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" Mar 10 15:52:25 crc kubenswrapper[4749]: I0310 15:52:25.756079 4749 patch_prober.go:28] interesting pod/controller-manager-6447697678-j6jqc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 10 15:52:25 crc kubenswrapper[4749]: I0310 15:52:25.756189 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc" 
podUID="21717e88-ce9f-4fb7-ab8e-82722f94ca2c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.457098 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-whhnl" Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.499419 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 15:52:31 crc kubenswrapper[4749]: E0310 15:52:31.499818 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd06320-80a5-4a1c-a773-b27e88c39e0f" containerName="pruner" Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.499832 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd06320-80a5-4a1c-a773-b27e88c39e0f" containerName="pruner" Mar 10 15:52:31 crc kubenswrapper[4749]: E0310 15:52:31.499847 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b009296-7e7c-4e1b-bec2-24cf75849218" containerName="collect-profiles" Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.499855 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b009296-7e7c-4e1b-bec2-24cf75849218" containerName="collect-profiles" Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.499969 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b009296-7e7c-4e1b-bec2-24cf75849218" containerName="collect-profiles" Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.499989 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcd06320-80a5-4a1c-a773-b27e88c39e0f" containerName="pruner" Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.500667 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.504264 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.619578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.619634 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.721159 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.721240 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.721449 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.766647 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:52:31 crc kubenswrapper[4749]: I0310 15:52:31.824190 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:52:32 crc kubenswrapper[4749]: I0310 15:52:32.167308 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:52:32 crc kubenswrapper[4749]: I0310 15:52:32.329774 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1ebb843-2c73-4908-abb8-4d7c35a8c0c8-kubelet-dir\") pod \"b1ebb843-2c73-4908-abb8-4d7c35a8c0c8\" (UID: \"b1ebb843-2c73-4908-abb8-4d7c35a8c0c8\") " Mar 10 15:52:32 crc kubenswrapper[4749]: I0310 15:52:32.329909 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1ebb843-2c73-4908-abb8-4d7c35a8c0c8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b1ebb843-2c73-4908-abb8-4d7c35a8c0c8" (UID: "b1ebb843-2c73-4908-abb8-4d7c35a8c0c8"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:52:32 crc kubenswrapper[4749]: I0310 15:52:32.330563 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1ebb843-2c73-4908-abb8-4d7c35a8c0c8-kube-api-access\") pod \"b1ebb843-2c73-4908-abb8-4d7c35a8c0c8\" (UID: \"b1ebb843-2c73-4908-abb8-4d7c35a8c0c8\") " Mar 10 15:52:32 crc kubenswrapper[4749]: I0310 15:52:32.331152 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1ebb843-2c73-4908-abb8-4d7c35a8c0c8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:32 crc kubenswrapper[4749]: I0310 15:52:32.336120 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1ebb843-2c73-4908-abb8-4d7c35a8c0c8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b1ebb843-2c73-4908-abb8-4d7c35a8c0c8" (UID: "b1ebb843-2c73-4908-abb8-4d7c35a8c0c8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:52:32 crc kubenswrapper[4749]: I0310 15:52:32.432628 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1ebb843-2c73-4908-abb8-4d7c35a8c0c8-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:32 crc kubenswrapper[4749]: I0310 15:52:32.600351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b1ebb843-2c73-4908-abb8-4d7c35a8c0c8","Type":"ContainerDied","Data":"afd2592c9aefbf5c6580a5e62ff165ca3595c566d3928b8abea7f8afc88ed2bb"} Mar 10 15:52:32 crc kubenswrapper[4749]: I0310 15:52:32.600411 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afd2592c9aefbf5c6580a5e62ff165ca3595c566d3928b8abea7f8afc88ed2bb" Mar 10 15:52:32 crc kubenswrapper[4749]: I0310 15:52:32.600446 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:52:35 crc kubenswrapper[4749]: I0310 15:52:35.683340 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 15:52:35 crc kubenswrapper[4749]: E0310 15:52:35.684113 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ebb843-2c73-4908-abb8-4d7c35a8c0c8" containerName="pruner" Mar 10 15:52:35 crc kubenswrapper[4749]: I0310 15:52:35.684134 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ebb843-2c73-4908-abb8-4d7c35a8c0c8" containerName="pruner" Mar 10 15:52:35 crc kubenswrapper[4749]: I0310 15:52:35.684288 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1ebb843-2c73-4908-abb8-4d7c35a8c0c8" containerName="pruner" Mar 10 15:52:35 crc kubenswrapper[4749]: I0310 15:52:35.684932 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:52:35 crc kubenswrapper[4749]: I0310 15:52:35.710761 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 15:52:35 crc kubenswrapper[4749]: I0310 15:52:35.781772 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:52:35 crc kubenswrapper[4749]: I0310 15:52:35.781842 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-kube-api-access\") pod \"installer-9-crc\" (UID: \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:52:35 crc kubenswrapper[4749]: I0310 15:52:35.781907 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-var-lock\") pod \"installer-9-crc\" (UID: \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:52:35 crc kubenswrapper[4749]: I0310 15:52:35.883884 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-var-lock\") pod \"installer-9-crc\" (UID: \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:52:35 crc kubenswrapper[4749]: I0310 15:52:35.884074 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:52:35 crc kubenswrapper[4749]: I0310 15:52:35.884144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:52:35 crc kubenswrapper[4749]: I0310 15:52:35.884099 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-var-lock\") pod \"installer-9-crc\" (UID: \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:52:35 crc kubenswrapper[4749]: I0310 15:52:35.884311 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-kube-api-access\") pod \"installer-9-crc\" (UID: \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:52:35 crc kubenswrapper[4749]: I0310 15:52:35.904238 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-kube-api-access\") pod \"installer-9-crc\" (UID: \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:52:36 crc kubenswrapper[4749]: I0310 15:52:36.018602 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:52:36 crc kubenswrapper[4749]: I0310 15:52:36.588138 4749 patch_prober.go:28] interesting pod/route-controller-manager-d678b9987-d4p2t container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:52:36 crc kubenswrapper[4749]: I0310 15:52:36.588947 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" podUID="b4cb9fcd-a072-4cfd-8051-c51e012326ce" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 15:52:36 crc kubenswrapper[4749]: I0310 15:52:36.755117 4749 patch_prober.go:28] interesting pod/controller-manager-6447697678-j6jqc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:52:36 crc kubenswrapper[4749]: I0310 15:52:36.755238 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc" podUID="21717e88-ce9f-4fb7-ab8e-82722f94ca2c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 15:52:38 crc kubenswrapper[4749]: I0310 15:52:38.731609 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:52:38 crc kubenswrapper[4749]: E0310 15:52:38.760177 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 10 15:52:38 crc kubenswrapper[4749]: E0310 15:52:38.760403 4749 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:52:38 crc kubenswrapper[4749]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 10 15:52:38 crc kubenswrapper[4749]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6wpgv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29552630-vvkbm_openshift-infra(046a02a2-14f4-4368-9f21-58d96a510927): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 10 15:52:38 crc kubenswrapper[4749]: > logger="UnhandledError" Mar 10 15:52:38 crc kubenswrapper[4749]: E0310 15:52:38.761791 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: 
\"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29552630-vvkbm" podUID="046a02a2-14f4-4368-9f21-58d96a510927" Mar 10 15:52:38 crc kubenswrapper[4749]: I0310 15:52:38.874229 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc" Mar 10 15:52:38 crc kubenswrapper[4749]: I0310 15:52:38.880315 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" Mar 10 15:52:38 crc kubenswrapper[4749]: I0310 15:52:38.909245 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8f5f46b85-qlp4l"] Mar 10 15:52:38 crc kubenswrapper[4749]: E0310 15:52:38.909584 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cb9fcd-a072-4cfd-8051-c51e012326ce" containerName="route-controller-manager" Mar 10 15:52:38 crc kubenswrapper[4749]: I0310 15:52:38.909599 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cb9fcd-a072-4cfd-8051-c51e012326ce" containerName="route-controller-manager" Mar 10 15:52:38 crc kubenswrapper[4749]: E0310 15:52:38.909619 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21717e88-ce9f-4fb7-ab8e-82722f94ca2c" containerName="controller-manager" Mar 10 15:52:38 crc kubenswrapper[4749]: I0310 15:52:38.909626 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="21717e88-ce9f-4fb7-ab8e-82722f94ca2c" containerName="controller-manager" Mar 10 15:52:38 crc kubenswrapper[4749]: I0310 15:52:38.909736 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4cb9fcd-a072-4cfd-8051-c51e012326ce" containerName="route-controller-manager" Mar 10 15:52:38 crc kubenswrapper[4749]: I0310 15:52:38.909746 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="21717e88-ce9f-4fb7-ab8e-82722f94ca2c" containerName="controller-manager" Mar 10 15:52:38 crc kubenswrapper[4749]: I0310 15:52:38.910235 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:38 crc kubenswrapper[4749]: I0310 15:52:38.924127 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8f5f46b85-qlp4l"] Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.032636 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-serving-cert\") pod \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.032710 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-config\") pod \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.032904 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4cb9fcd-a072-4cfd-8051-c51e012326ce-serving-cert\") pod \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.032949 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-proxy-ca-bundles\") pod \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.033131 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-wgmw8\" (UniqueName: \"kubernetes.io/projected/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-kube-api-access-wgmw8\") pod \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.033172 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xngkq\" (UniqueName: \"kubernetes.io/projected/b4cb9fcd-a072-4cfd-8051-c51e012326ce-kube-api-access-xngkq\") pod \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.033194 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4cb9fcd-a072-4cfd-8051-c51e012326ce-client-ca\") pod \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.033228 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4cb9fcd-a072-4cfd-8051-c51e012326ce-config\") pod \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\" (UID: \"b4cb9fcd-a072-4cfd-8051-c51e012326ce\") " Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.033249 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-client-ca\") pod \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\" (UID: \"21717e88-ce9f-4fb7-ab8e-82722f94ca2c\") " Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.033548 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-proxy-ca-bundles\") pod \"controller-manager-8f5f46b85-qlp4l\" (UID: 
\"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.033605 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-serving-cert\") pod \"controller-manager-8f5f46b85-qlp4l\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.033695 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rspns\" (UniqueName: \"kubernetes.io/projected/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-kube-api-access-rspns\") pod \"controller-manager-8f5f46b85-qlp4l\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.033734 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-config\") pod \"controller-manager-8f5f46b85-qlp4l\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.033764 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-client-ca\") pod \"controller-manager-8f5f46b85-qlp4l\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.034334 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "21717e88-ce9f-4fb7-ab8e-82722f94ca2c" (UID: "21717e88-ce9f-4fb7-ab8e-82722f94ca2c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.034643 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-config" (OuterVolumeSpecName: "config") pod "21717e88-ce9f-4fb7-ab8e-82722f94ca2c" (UID: "21717e88-ce9f-4fb7-ab8e-82722f94ca2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.034826 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-client-ca" (OuterVolumeSpecName: "client-ca") pod "21717e88-ce9f-4fb7-ab8e-82722f94ca2c" (UID: "21717e88-ce9f-4fb7-ab8e-82722f94ca2c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.035040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4cb9fcd-a072-4cfd-8051-c51e012326ce-config" (OuterVolumeSpecName: "config") pod "b4cb9fcd-a072-4cfd-8051-c51e012326ce" (UID: "b4cb9fcd-a072-4cfd-8051-c51e012326ce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.035121 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4cb9fcd-a072-4cfd-8051-c51e012326ce-client-ca" (OuterVolumeSpecName: "client-ca") pod "b4cb9fcd-a072-4cfd-8051-c51e012326ce" (UID: "b4cb9fcd-a072-4cfd-8051-c51e012326ce"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.040156 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4cb9fcd-a072-4cfd-8051-c51e012326ce-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b4cb9fcd-a072-4cfd-8051-c51e012326ce" (UID: "b4cb9fcd-a072-4cfd-8051-c51e012326ce"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.040223 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4cb9fcd-a072-4cfd-8051-c51e012326ce-kube-api-access-xngkq" (OuterVolumeSpecName: "kube-api-access-xngkq") pod "b4cb9fcd-a072-4cfd-8051-c51e012326ce" (UID: "b4cb9fcd-a072-4cfd-8051-c51e012326ce"). InnerVolumeSpecName "kube-api-access-xngkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.041092 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-kube-api-access-wgmw8" (OuterVolumeSpecName: "kube-api-access-wgmw8") pod "21717e88-ce9f-4fb7-ab8e-82722f94ca2c" (UID: "21717e88-ce9f-4fb7-ab8e-82722f94ca2c"). InnerVolumeSpecName "kube-api-access-wgmw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.042008 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "21717e88-ce9f-4fb7-ab8e-82722f94ca2c" (UID: "21717e88-ce9f-4fb7-ab8e-82722f94ca2c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.135668 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rspns\" (UniqueName: \"kubernetes.io/projected/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-kube-api-access-rspns\") pod \"controller-manager-8f5f46b85-qlp4l\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.135748 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-config\") pod \"controller-manager-8f5f46b85-qlp4l\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.135770 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-client-ca\") pod \"controller-manager-8f5f46b85-qlp4l\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.135813 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-proxy-ca-bundles\") pod \"controller-manager-8f5f46b85-qlp4l\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.135839 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-serving-cert\") pod 
\"controller-manager-8f5f46b85-qlp4l\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.137459 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgmw8\" (UniqueName: \"kubernetes.io/projected/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-kube-api-access-wgmw8\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.137476 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-config\") pod \"controller-manager-8f5f46b85-qlp4l\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.138082 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xngkq\" (UniqueName: \"kubernetes.io/projected/b4cb9fcd-a072-4cfd-8051-c51e012326ce-kube-api-access-xngkq\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.138151 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4cb9fcd-a072-4cfd-8051-c51e012326ce-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.138165 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4cb9fcd-a072-4cfd-8051-c51e012326ce-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.138183 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.138197 4749 reconciler_common.go:293] "Volume detached 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.138206 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.138220 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4cb9fcd-a072-4cfd-8051-c51e012326ce-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.138260 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21717e88-ce9f-4fb7-ab8e-82722f94ca2c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.140268 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-proxy-ca-bundles\") pod \"controller-manager-8f5f46b85-qlp4l\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.140838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-client-ca\") pod \"controller-manager-8f5f46b85-qlp4l\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.142771 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-serving-cert\") pod 
\"controller-manager-8f5f46b85-qlp4l\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.154862 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rspns\" (UniqueName: \"kubernetes.io/projected/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-kube-api-access-rspns\") pod \"controller-manager-8f5f46b85-qlp4l\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.228755 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:39 crc kubenswrapper[4749]: E0310 15:52:39.523286 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 10 15:52:39 crc kubenswrapper[4749]: E0310 15:52:39.523541 4749 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:52:39 crc kubenswrapper[4749]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 10 15:52:39 crc kubenswrapper[4749]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z6bfw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29552632-xx7p8_openshift-infra(dec11cc4-b4eb-4b0b-b803-832ac4051974): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 10 15:52:39 crc kubenswrapper[4749]: > logger="UnhandledError" Mar 10 15:52:39 crc kubenswrapper[4749]: E0310 15:52:39.525133 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29552632-xx7p8" podUID="dec11cc4-b4eb-4b0b-b803-832ac4051974" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.657366 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.657937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6447697678-j6jqc" event={"ID":"21717e88-ce9f-4fb7-ab8e-82722f94ca2c","Type":"ContainerDied","Data":"4db642a7985fbb958eac77d090867201019074f00c56b7bbde7680473f773723"} Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.658704 4749 scope.go:117] "RemoveContainer" containerID="55925ab8d4c6c825de84ed0c61835ff6e1762a25696a9b30c6fcaabca8678680" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.660241 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" event={"ID":"b4cb9fcd-a072-4cfd-8051-c51e012326ce","Type":"ContainerDied","Data":"1b1e1d4e76fa141dd9ded9c3c0965af6a599e9149d39fffea2a66d3d722b7c33"} Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.660341 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t" Mar 10 15:52:39 crc kubenswrapper[4749]: E0310 15:52:39.661472 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29552632-xx7p8" podUID="dec11cc4-b4eb-4b0b-b803-832ac4051974" Mar 10 15:52:39 crc kubenswrapper[4749]: E0310 15:52:39.661808 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29552630-vvkbm" podUID="046a02a2-14f4-4368-9f21-58d96a510927" Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.719754 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6447697678-j6jqc"] Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.728133 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6447697678-j6jqc"] Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.731523 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"] Mar 10 15:52:39 crc kubenswrapper[4749]: I0310 15:52:39.734301 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d678b9987-d4p2t"] Mar 10 15:52:41 crc kubenswrapper[4749]: I0310 15:52:41.615519 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21717e88-ce9f-4fb7-ab8e-82722f94ca2c" path="/var/lib/kubelet/pods/21717e88-ce9f-4fb7-ab8e-82722f94ca2c/volumes" Mar 10 15:52:41 crc kubenswrapper[4749]: I0310 15:52:41.616275 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="b4cb9fcd-a072-4cfd-8051-c51e012326ce" path="/var/lib/kubelet/pods/b4cb9fcd-a072-4cfd-8051-c51e012326ce/volumes" Mar 10 15:52:42 crc kubenswrapper[4749]: I0310 15:52:42.517665 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.132144 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h"] Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.136562 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.141870 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.143088 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.143348 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.143439 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.143684 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.145343 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h"] Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.182167 4749 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.318746 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7bfc\" (UniqueName: \"kubernetes.io/projected/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-kube-api-access-j7bfc\") pod \"route-controller-manager-68f4499ccf-g6z8h\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") " pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.319196 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-client-ca\") pod \"route-controller-manager-68f4499ccf-g6z8h\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") " pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.319242 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-config\") pod \"route-controller-manager-68f4499ccf-g6z8h\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") " pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.319316 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-serving-cert\") pod \"route-controller-manager-68f4499ccf-g6z8h\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") " pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.420547 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j7bfc\" (UniqueName: \"kubernetes.io/projected/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-kube-api-access-j7bfc\") pod \"route-controller-manager-68f4499ccf-g6z8h\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") " pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.420666 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-client-ca\") pod \"route-controller-manager-68f4499ccf-g6z8h\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") " pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.420717 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-config\") pod \"route-controller-manager-68f4499ccf-g6z8h\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") " pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.420763 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-serving-cert\") pod \"route-controller-manager-68f4499ccf-g6z8h\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") " pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.422653 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-client-ca\") pod \"route-controller-manager-68f4499ccf-g6z8h\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") " 
pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.422946 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-config\") pod \"route-controller-manager-68f4499ccf-g6z8h\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") " pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.438315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-serving-cert\") pod \"route-controller-manager-68f4499ccf-g6z8h\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") " pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.445533 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7bfc\" (UniqueName: \"kubernetes.io/projected/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-kube-api-access-j7bfc\") pod \"route-controller-manager-68f4499ccf-g6z8h\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") " pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:43 crc kubenswrapper[4749]: I0310 15:52:43.505894 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:44 crc kubenswrapper[4749]: E0310 15:52:44.544094 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 10 15:52:44 crc kubenswrapper[4749]: E0310 15:52:44.544879 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gr4d2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},Star
tupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-h9zb9_openshift-marketplace(fea7768f-4827-4630-9169-8b44719ad779): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:52:44 crc kubenswrapper[4749]: E0310 15:52:44.546192 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-h9zb9" podUID="fea7768f-4827-4630-9169-8b44719ad779" Mar 10 15:52:44 crc kubenswrapper[4749]: E0310 15:52:44.623578 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 10 15:52:44 crc kubenswrapper[4749]: E0310 15:52:44.623815 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jdhgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-99764_openshift-marketplace(b27649fa-b5c8-4aca-9de3-37f171af6e1c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:52:44 crc kubenswrapper[4749]: E0310 15:52:44.625051 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-99764" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" Mar 10 15:52:48 crc 
kubenswrapper[4749]: I0310 15:52:48.418076 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dvzk"] Mar 10 15:52:50 crc kubenswrapper[4749]: E0310 15:52:50.680643 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 10 15:52:50 crc kubenswrapper[4749]: E0310 15:52:50.681367 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqf9j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fal
lbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-dgb8q_openshift-marketplace(ab8ea474-0133-461f-8498-033785eb0a52): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:52:50 crc kubenswrapper[4749]: E0310 15:52:50.682594 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-dgb8q" podUID="ab8ea474-0133-461f-8498-033785eb0a52" Mar 10 15:52:50 crc kubenswrapper[4749]: E0310 15:52:50.724972 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 10 15:52:50 crc kubenswrapper[4749]: E0310 15:52:50.725217 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p2qrh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rg2tg_openshift-marketplace(c08511ac-9832-428c-be08-de0771ee5254): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:52:50 crc kubenswrapper[4749]: E0310 15:52:50.728786 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rg2tg" podUID="c08511ac-9832-428c-be08-de0771ee5254" Mar 10 15:52:50 crc 
kubenswrapper[4749]: I0310 15:52:50.980636 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:52:50 crc kubenswrapper[4749]: I0310 15:52:50.980719 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:52:51 crc kubenswrapper[4749]: I0310 15:52:51.856941 4749 ???:1] "http: TLS handshake error from 192.168.126.11:34266: no serving certificate available for the kubelet" Mar 10 15:52:51 crc kubenswrapper[4749]: E0310 15:52:51.862854 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 10 15:52:51 crc kubenswrapper[4749]: E0310 15:52:51.863291 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vp9kj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-9qnqn_openshift-marketplace(3484c369-2c9c-48d5-b7be-9dadf06d09ca): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:52:51 crc kubenswrapper[4749]: E0310 15:52:51.864899 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-9qnqn" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" Mar 10 15:52:52 crc 
kubenswrapper[4749]: E0310 15:52:52.420732 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-99764" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" Mar 10 15:52:52 crc kubenswrapper[4749]: E0310 15:52:52.421191 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-h9zb9" podUID="fea7768f-4827-4630-9169-8b44719ad779" Mar 10 15:52:52 crc kubenswrapper[4749]: E0310 15:52:52.503147 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 15:52:52 crc kubenswrapper[4749]: E0310 15:52:52.503508 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6bb42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2v6nf_openshift-marketplace(c56f09b3-981c-4a01-8ea3-4417c239cea6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:52:52 crc kubenswrapper[4749]: E0310 15:52:52.504770 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2v6nf" podUID="c56f09b3-981c-4a01-8ea3-4417c239cea6" Mar 10 15:52:52 crc 
kubenswrapper[4749]: E0310 15:52:52.508557 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rg2tg" podUID="c08511ac-9832-428c-be08-de0771ee5254" Mar 10 15:52:52 crc kubenswrapper[4749]: E0310 15:52:52.508596 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-dgb8q" podUID="ab8ea474-0133-461f-8498-033785eb0a52" Mar 10 15:52:52 crc kubenswrapper[4749]: I0310 15:52:52.520612 4749 scope.go:117] "RemoveContainer" containerID="e9b86f0aee4556b8daef7b67daebe65c0e2a4057c904e6a2574087280223c702" Mar 10 15:52:52 crc kubenswrapper[4749]: E0310 15:52:52.522119 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 15:52:52 crc kubenswrapper[4749]: E0310 15:52:52.522421 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h88qz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-btwnr_openshift-marketplace(a19adf76-af03-4d7f-8661-4d93c67fda2e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:52:52 crc kubenswrapper[4749]: E0310 15:52:52.524000 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-btwnr" podUID="a19adf76-af03-4d7f-8661-4d93c67fda2e" Mar 10 15:52:52 crc 
kubenswrapper[4749]: I0310 15:52:52.766067 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8","Type":"ContainerStarted","Data":"48c82c9b9c5a99623a5f8694dcf9a08621193ab6f2c01315b25d710eb9cb07cd"} Mar 10 15:52:52 crc kubenswrapper[4749]: E0310 15:52:52.768528 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-btwnr" podUID="a19adf76-af03-4d7f-8661-4d93c67fda2e" Mar 10 15:52:52 crc kubenswrapper[4749]: E0310 15:52:52.774142 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2v6nf" podUID="c56f09b3-981c-4a01-8ea3-4417c239cea6" Mar 10 15:52:52 crc kubenswrapper[4749]: E0310 15:52:52.774492 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-9qnqn" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" Mar 10 15:52:52 crc kubenswrapper[4749]: I0310 15:52:52.810208 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.066781 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h"] Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.070018 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-8f5f46b85-qlp4l"] Mar 10 15:52:53 crc kubenswrapper[4749]: W0310 15:52:53.087973 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff0e4ac1_6c65_49f2_ad39_b1aeadf5a0be.slice/crio-de5d3f088228a5521e555483dd53ea4fbb9bff13ef3dbc7fad15dd972157e326 WatchSource:0}: Error finding container de5d3f088228a5521e555483dd53ea4fbb9bff13ef3dbc7fad15dd972157e326: Status 404 returned error can't find the container with id de5d3f088228a5521e555483dd53ea4fbb9bff13ef3dbc7fad15dd972157e326 Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.772838 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8","Type":"ContainerStarted","Data":"699f41ce7842098fe18179df8f7ed9ea908a1021ae67c15e49937e5cb0bfb2ac"} Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.775778 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df","Type":"ContainerStarted","Data":"00f3345507ae95828c376056f82d3f59525a74ff0bbcedc9332b197bd4d91b13"} Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.775825 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df","Type":"ContainerStarted","Data":"dce4195d0c301a33f8ffbbe07338d3532a73a341831b0da9a0a3d49288ee519d"} Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.779891 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552630-vvkbm" event={"ID":"046a02a2-14f4-4368-9f21-58d96a510927","Type":"ContainerStarted","Data":"cdf2b1e8906734bb83dd1444e0fcf3c085b3c1b29c40074eebfdc4e2bbf25dea"} Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.781897 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" event={"ID":"a1f37ab7-5993-4a93-848c-2bf22bd81cb1","Type":"ContainerStarted","Data":"1d176850f9f3de7dd40b6daa19a31defe8b22459d0371b9742f1718f9af5924e"} Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.782012 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.782027 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" event={"ID":"a1f37ab7-5993-4a93-848c-2bf22bd81cb1","Type":"ContainerStarted","Data":"f4c2f3fd41680055d3639fd0b3d066e2b93e20dbab5349cba94dc36a4938d848"} Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.784514 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" event={"ID":"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be","Type":"ContainerStarted","Data":"d6435a4f5d56d6ef06bbf7476e8155aeb3664bb4427e545f6bd2838bbd7530e5"} Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.784568 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" event={"ID":"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be","Type":"ContainerStarted","Data":"de5d3f088228a5521e555483dd53ea4fbb9bff13ef3dbc7fad15dd972157e326"} Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.784729 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.826626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n829z" 
event={"ID":"f5296034-50c7-42d3-b2f4-34f1d451be99","Type":"ContainerStarted","Data":"72346773b4f848face18c55b34cc402ef70e709b5b0d6c1ac1ea0618d289831c"} Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.849285 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552630-vvkbm" podStartSLOduration=115.217721748 podStartE2EDuration="2m53.849258142s" podCreationTimestamp="2026-03-10 15:50:00 +0000 UTC" firstStartedPulling="2026-03-10 15:51:54.062268864 +0000 UTC m=+211.184134551" lastFinishedPulling="2026-03-10 15:52:52.693805258 +0000 UTC m=+269.815670945" observedRunningTime="2026-03-10 15:52:53.846410691 +0000 UTC m=+270.968276388" watchObservedRunningTime="2026-03-10 15:52:53.849258142 +0000 UTC m=+270.971123829" Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.850178 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=22.850173188 podStartE2EDuration="22.850173188s" podCreationTimestamp="2026-03-10 15:52:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:52:53.823149714 +0000 UTC m=+270.945015401" watchObservedRunningTime="2026-03-10 15:52:53.850173188 +0000 UTC m=+270.972038875" Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.850670 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.885337 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=18.885314695 podStartE2EDuration="18.885314695s" podCreationTimestamp="2026-03-10 15:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 15:52:53.876088031 +0000 UTC m=+270.997953718" watchObservedRunningTime="2026-03-10 15:52:53.885314695 +0000 UTC m=+271.007180382" Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.916334 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" podStartSLOduration=22.916317564 podStartE2EDuration="22.916317564s" podCreationTimestamp="2026-03-10 15:52:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:52:53.911039032 +0000 UTC m=+271.032904729" watchObservedRunningTime="2026-03-10 15:52:53.916317564 +0000 UTC m=+271.038183251" Mar 10 15:52:53 crc kubenswrapper[4749]: I0310 15:52:53.944011 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" podStartSLOduration=22.943994387 podStartE2EDuration="22.943994387s" podCreationTimestamp="2026-03-10 15:52:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:52:53.942414101 +0000 UTC m=+271.064279788" watchObservedRunningTime="2026-03-10 15:52:53.943994387 +0000 UTC m=+271.065860074" Mar 10 15:52:54 crc kubenswrapper[4749]: I0310 15:52:54.050096 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:52:54 crc kubenswrapper[4749]: I0310 15:52:54.181586 4749 csr.go:261] certificate signing request csr-xvdsv is approved, waiting to be issued Mar 10 15:52:54 crc kubenswrapper[4749]: I0310 15:52:54.196522 4749 csr.go:257] certificate signing request csr-xvdsv is issued Mar 10 15:52:54 crc kubenswrapper[4749]: I0310 15:52:54.837678 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="046a02a2-14f4-4368-9f21-58d96a510927" containerID="cdf2b1e8906734bb83dd1444e0fcf3c085b3c1b29c40074eebfdc4e2bbf25dea" exitCode=0 Mar 10 15:52:54 crc kubenswrapper[4749]: I0310 15:52:54.838007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552630-vvkbm" event={"ID":"046a02a2-14f4-4368-9f21-58d96a510927","Type":"ContainerDied","Data":"cdf2b1e8906734bb83dd1444e0fcf3c085b3c1b29c40074eebfdc4e2bbf25dea"} Mar 10 15:52:54 crc kubenswrapper[4749]: I0310 15:52:54.842434 4749 generic.go:334] "Generic (PLEG): container finished" podID="f5296034-50c7-42d3-b2f4-34f1d451be99" containerID="72346773b4f848face18c55b34cc402ef70e709b5b0d6c1ac1ea0618d289831c" exitCode=0 Mar 10 15:52:54 crc kubenswrapper[4749]: I0310 15:52:54.842517 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n829z" event={"ID":"f5296034-50c7-42d3-b2f4-34f1d451be99","Type":"ContainerDied","Data":"72346773b4f848face18c55b34cc402ef70e709b5b0d6c1ac1ea0618d289831c"} Mar 10 15:52:54 crc kubenswrapper[4749]: I0310 15:52:54.846618 4749 generic.go:334] "Generic (PLEG): container finished" podID="abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8" containerID="699f41ce7842098fe18179df8f7ed9ea908a1021ae67c15e49937e5cb0bfb2ac" exitCode=0 Mar 10 15:52:54 crc kubenswrapper[4749]: I0310 15:52:54.847622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8","Type":"ContainerDied","Data":"699f41ce7842098fe18179df8f7ed9ea908a1021ae67c15e49937e5cb0bfb2ac"} Mar 10 15:52:55 crc kubenswrapper[4749]: I0310 15:52:55.198099 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-28 13:03:38.67707376 +0000 UTC Mar 10 15:52:55 crc kubenswrapper[4749]: I0310 15:52:55.198176 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 
6309h10m43.478901419s for next certificate rotation Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.198407 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-25 10:44:29.637509894 +0000 UTC Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.198922 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6954h51m33.438590961s for next certificate rotation Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.199434 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552630-vvkbm" Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.204478 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.364000 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8-kube-api-access\") pod \"abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8\" (UID: \"abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8\") " Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.364088 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wpgv\" (UniqueName: \"kubernetes.io/projected/046a02a2-14f4-4368-9f21-58d96a510927-kube-api-access-6wpgv\") pod \"046a02a2-14f4-4368-9f21-58d96a510927\" (UID: \"046a02a2-14f4-4368-9f21-58d96a510927\") " Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.364119 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8-kubelet-dir\") pod \"abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8\" (UID: \"abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8\") " Mar 10 15:52:56 crc kubenswrapper[4749]: 
I0310 15:52:56.364731 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8" (UID: "abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.373144 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8" (UID: "abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.375258 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/046a02a2-14f4-4368-9f21-58d96a510927-kube-api-access-6wpgv" (OuterVolumeSpecName: "kube-api-access-6wpgv") pod "046a02a2-14f4-4368-9f21-58d96a510927" (UID: "046a02a2-14f4-4368-9f21-58d96a510927"). InnerVolumeSpecName "kube-api-access-6wpgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.465882 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.465946 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wpgv\" (UniqueName: \"kubernetes.io/projected/046a02a2-14f4-4368-9f21-58d96a510927-kube-api-access-6wpgv\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.465965 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.859787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8","Type":"ContainerDied","Data":"48c82c9b9c5a99623a5f8694dcf9a08621193ab6f2c01315b25d710eb9cb07cd"} Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.860073 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48c82c9b9c5a99623a5f8694dcf9a08621193ab6f2c01315b25d710eb9cb07cd" Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.859841 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.861150 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552630-vvkbm" event={"ID":"046a02a2-14f4-4368-9f21-58d96a510927","Type":"ContainerDied","Data":"fab4ea0f4df8cc6d910a30edf5baa1295c01c244de7cd87252772f4c06132a1b"} Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.861199 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fab4ea0f4df8cc6d910a30edf5baa1295c01c244de7cd87252772f4c06132a1b" Mar 10 15:52:56 crc kubenswrapper[4749]: I0310 15:52:56.861205 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552630-vvkbm" Mar 10 15:52:57 crc kubenswrapper[4749]: I0310 15:52:57.871658 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552632-xx7p8" event={"ID":"dec11cc4-b4eb-4b0b-b803-832ac4051974","Type":"ContainerStarted","Data":"7f9621f29ed38d6ea4048c4fa33039fff9f4e6fdc3241491f571eeae8a8a90c7"} Mar 10 15:52:58 crc kubenswrapper[4749]: I0310 15:52:58.885729 4749 generic.go:334] "Generic (PLEG): container finished" podID="dec11cc4-b4eb-4b0b-b803-832ac4051974" containerID="7f9621f29ed38d6ea4048c4fa33039fff9f4e6fdc3241491f571eeae8a8a90c7" exitCode=0 Mar 10 15:52:58 crc kubenswrapper[4749]: I0310 15:52:58.885880 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552632-xx7p8" event={"ID":"dec11cc4-b4eb-4b0b-b803-832ac4051974","Type":"ContainerDied","Data":"7f9621f29ed38d6ea4048c4fa33039fff9f4e6fdc3241491f571eeae8a8a90c7"} Mar 10 15:52:58 crc kubenswrapper[4749]: I0310 15:52:58.888892 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n829z" 
event={"ID":"f5296034-50c7-42d3-b2f4-34f1d451be99","Type":"ContainerStarted","Data":"3a01343d650fc3d5ef536514b1ee7e77aa51c7b3d3328e142d6c7e27b8ce4d74"}
Mar 10 15:53:00 crc kubenswrapper[4749]: I0310 15:53:00.231453 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552632-xx7p8"
Mar 10 15:53:00 crc kubenswrapper[4749]: I0310 15:53:00.265035 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n829z" podStartSLOduration=4.423624285 podStartE2EDuration="58.265006411s" podCreationTimestamp="2026-03-10 15:52:02 +0000 UTC" firstStartedPulling="2026-03-10 15:52:04.368035831 +0000 UTC m=+221.489901528" lastFinishedPulling="2026-03-10 15:52:58.209417947 +0000 UTC m=+275.331283654" observedRunningTime="2026-03-10 15:52:58.918262486 +0000 UTC m=+276.040128193" watchObservedRunningTime="2026-03-10 15:53:00.265006411 +0000 UTC m=+277.386872098"
Mar 10 15:53:00 crc kubenswrapper[4749]: I0310 15:53:00.429204 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6bfw\" (UniqueName: \"kubernetes.io/projected/dec11cc4-b4eb-4b0b-b803-832ac4051974-kube-api-access-z6bfw\") pod \"dec11cc4-b4eb-4b0b-b803-832ac4051974\" (UID: \"dec11cc4-b4eb-4b0b-b803-832ac4051974\") "
Mar 10 15:53:00 crc kubenswrapper[4749]: I0310 15:53:00.437327 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec11cc4-b4eb-4b0b-b803-832ac4051974-kube-api-access-z6bfw" (OuterVolumeSpecName: "kube-api-access-z6bfw") pod "dec11cc4-b4eb-4b0b-b803-832ac4051974" (UID: "dec11cc4-b4eb-4b0b-b803-832ac4051974"). InnerVolumeSpecName "kube-api-access-z6bfw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:53:00 crc kubenswrapper[4749]: I0310 15:53:00.530925 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6bfw\" (UniqueName: \"kubernetes.io/projected/dec11cc4-b4eb-4b0b-b803-832ac4051974-kube-api-access-z6bfw\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:00 crc kubenswrapper[4749]: I0310 15:53:00.904569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552632-xx7p8" event={"ID":"dec11cc4-b4eb-4b0b-b803-832ac4051974","Type":"ContainerDied","Data":"39994db12380b04b2511404953c9312b14e44e45a4a12ead48e9c914d8199c39"}
Mar 10 15:53:00 crc kubenswrapper[4749]: I0310 15:53:00.904639 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39994db12380b04b2511404953c9312b14e44e45a4a12ead48e9c914d8199c39"
Mar 10 15:53:00 crc kubenswrapper[4749]: I0310 15:53:00.904717 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552632-xx7p8"
Mar 10 15:53:02 crc kubenswrapper[4749]: I0310 15:53:02.608423 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n829z"
Mar 10 15:53:02 crc kubenswrapper[4749]: I0310 15:53:02.609714 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n829z"
Mar 10 15:53:04 crc kubenswrapper[4749]: I0310 15:53:04.015174 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n829z" podUID="f5296034-50c7-42d3-b2f4-34f1d451be99" containerName="registry-server" probeResult="failure" output=<
Mar 10 15:53:04 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s
Mar 10 15:53:04 crc kubenswrapper[4749]: >
Mar 10 15:53:11 crc kubenswrapper[4749]: I0310 15:53:11.776682 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8f5f46b85-qlp4l"]
Mar 10 15:53:11 crc kubenswrapper[4749]: I0310 15:53:11.777295 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" podUID="a1f37ab7-5993-4a93-848c-2bf22bd81cb1" containerName="controller-manager" containerID="cri-o://1d176850f9f3de7dd40b6daa19a31defe8b22459d0371b9742f1718f9af5924e" gracePeriod=30
Mar 10 15:53:11 crc kubenswrapper[4749]: I0310 15:53:11.877994 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h"]
Mar 10 15:53:11 crc kubenswrapper[4749]: I0310 15:53:11.878718 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" podUID="ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be" containerName="route-controller-manager" containerID="cri-o://d6435a4f5d56d6ef06bbf7476e8155aeb3664bb4427e545f6bd2838bbd7530e5" gracePeriod=30
Mar 10 15:53:12 crc kubenswrapper[4749]: I0310 15:53:12.682407 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n829z"
Mar 10 15:53:12 crc kubenswrapper[4749]: I0310 15:53:12.727078 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n829z"
Mar 10 15:53:12 crc kubenswrapper[4749]: I0310 15:53:12.915310 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n829z"]
Mar 10 15:53:13 crc kubenswrapper[4749]: I0310 15:53:13.039566 4749 generic.go:334] "Generic (PLEG): container finished" podID="ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be" containerID="d6435a4f5d56d6ef06bbf7476e8155aeb3664bb4427e545f6bd2838bbd7530e5" exitCode=0
Mar 10 15:53:13 crc kubenswrapper[4749]: I0310 15:53:13.039648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" event={"ID":"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be","Type":"ContainerDied","Data":"d6435a4f5d56d6ef06bbf7476e8155aeb3664bb4427e545f6bd2838bbd7530e5"}
Mar 10 15:53:13 crc kubenswrapper[4749]: I0310 15:53:13.041198 4749 generic.go:334] "Generic (PLEG): container finished" podID="a1f37ab7-5993-4a93-848c-2bf22bd81cb1" containerID="1d176850f9f3de7dd40b6daa19a31defe8b22459d0371b9742f1718f9af5924e" exitCode=0
Mar 10 15:53:13 crc kubenswrapper[4749]: I0310 15:53:13.041254 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" event={"ID":"a1f37ab7-5993-4a93-848c-2bf22bd81cb1","Type":"ContainerDied","Data":"1d176850f9f3de7dd40b6daa19a31defe8b22459d0371b9742f1718f9af5924e"}
Mar 10 15:53:13 crc kubenswrapper[4749]: I0310 15:53:13.478442 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" podUID="75e7b399-bd4e-44b1-8c75-f0d81588911d" containerName="oauth-openshift" containerID="cri-o://97f2533fcd74a2260f355be2d5797d8343c672bd1c2de1597940db976c086b57" gracePeriod=15
Mar 10 15:53:13 crc kubenswrapper[4749]: I0310 15:53:13.507159 4749 patch_prober.go:28] interesting pod/route-controller-manager-68f4499ccf-g6z8h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body=
Mar 10 15:53:13 crc kubenswrapper[4749]: I0310 15:53:13.507240 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" podUID="ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.071336 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rg2tg" event={"ID":"c08511ac-9832-428c-be08-de0771ee5254","Type":"ContainerStarted","Data":"dc04157356b58916cc79bd9f71b43abe27a69fc6eabca519d0226abce170be29"}
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.074003 4749 generic.go:334] "Generic (PLEG): container finished" podID="75e7b399-bd4e-44b1-8c75-f0d81588911d" containerID="97f2533fcd74a2260f355be2d5797d8343c672bd1c2de1597940db976c086b57" exitCode=0
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.074109 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" event={"ID":"75e7b399-bd4e-44b1-8c75-f0d81588911d","Type":"ContainerDied","Data":"97f2533fcd74a2260f355be2d5797d8343c672bd1c2de1597940db976c086b57"}
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.076558 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-99764" event={"ID":"b27649fa-b5c8-4aca-9de3-37f171af6e1c","Type":"ContainerStarted","Data":"e08c68561fa1b193d25da3eebedc3905fd4b46309346b21b509b357f3de63e94"}
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.079062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btwnr" event={"ID":"a19adf76-af03-4d7f-8661-4d93c67fda2e","Type":"ContainerStarted","Data":"e61340a77749010fde4d7b1b845714b9993675ba75f8bb4749d9c77d5ca1b7ac"}
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.079282 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n829z" podUID="f5296034-50c7-42d3-b2f4-34f1d451be99" containerName="registry-server" containerID="cri-o://3a01343d650fc3d5ef536514b1ee7e77aa51c7b3d3328e142d6c7e27b8ce4d74" gracePeriod=2
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.088202 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.155143 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.169466 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"]
Mar 10 15:53:14 crc kubenswrapper[4749]: E0310 15:53:14.172529 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e7b399-bd4e-44b1-8c75-f0d81588911d" containerName="oauth-openshift"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.172562 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e7b399-bd4e-44b1-8c75-f0d81588911d" containerName="oauth-openshift"
Mar 10 15:53:14 crc kubenswrapper[4749]: E0310 15:53:14.172601 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8" containerName="pruner"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.172608 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8" containerName="pruner"
Mar 10 15:53:14 crc kubenswrapper[4749]: E0310 15:53:14.172620 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be" containerName="route-controller-manager"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.172627 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be" containerName="route-controller-manager"
Mar 10 15:53:14 crc kubenswrapper[4749]: E0310 15:53:14.172649 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec11cc4-b4eb-4b0b-b803-832ac4051974" containerName="oc"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.172853 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec11cc4-b4eb-4b0b-b803-832ac4051974" containerName="oc"
Mar 10 15:53:14 crc kubenswrapper[4749]: E0310 15:53:14.172937 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046a02a2-14f4-4368-9f21-58d96a510927" containerName="oc"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.172947 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="046a02a2-14f4-4368-9f21-58d96a510927" containerName="oc"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.173226 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-audit-policies\") pod \"75e7b399-bd4e-44b1-8c75-f0d81588911d\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.173325 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-provider-selection\") pod \"75e7b399-bd4e-44b1-8c75-f0d81588911d\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.173851 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be" containerName="route-controller-manager"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.173924 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-login\") pod \"75e7b399-bd4e-44b1-8c75-f0d81588911d\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.173961 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-ocp-branding-template\") pod \"75e7b399-bd4e-44b1-8c75-f0d81588911d\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.174038 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-error\") pod \"75e7b399-bd4e-44b1-8c75-f0d81588911d\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.175312 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "75e7b399-bd4e-44b1-8c75-f0d81588911d" (UID: "75e7b399-bd4e-44b1-8c75-f0d81588911d"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.175365 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec11cc4-b4eb-4b0b-b803-832ac4051974" containerName="oc"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.175427 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="046a02a2-14f4-4368-9f21-58d96a510927" containerName="oc"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.175449 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="abe52ac0-3e72-4c3e-a3ad-6f07a3ae6ee8" containerName="pruner"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.175673 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e7b399-bd4e-44b1-8c75-f0d81588911d" containerName="oauth-openshift"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.175663 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-cliconfig\") pod \"75e7b399-bd4e-44b1-8c75-f0d81588911d\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.175770 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/75e7b399-bd4e-44b1-8c75-f0d81588911d-audit-dir\") pod \"75e7b399-bd4e-44b1-8c75-f0d81588911d\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.176234 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-idp-0-file-data\") pod \"75e7b399-bd4e-44b1-8c75-f0d81588911d\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.176296 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-service-ca\") pod \"75e7b399-bd4e-44b1-8c75-f0d81588911d\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.176331 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-trusted-ca-bundle\") pod \"75e7b399-bd4e-44b1-8c75-f0d81588911d\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.179118 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "75e7b399-bd4e-44b1-8c75-f0d81588911d" (UID: "75e7b399-bd4e-44b1-8c75-f0d81588911d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.179221 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75e7b399-bd4e-44b1-8c75-f0d81588911d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "75e7b399-bd4e-44b1-8c75-f0d81588911d" (UID: "75e7b399-bd4e-44b1-8c75-f0d81588911d"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.181710 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "75e7b399-bd4e-44b1-8c75-f0d81588911d" (UID: "75e7b399-bd4e-44b1-8c75-f0d81588911d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.183382 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvc8m\" (UniqueName: \"kubernetes.io/projected/75e7b399-bd4e-44b1-8c75-f0d81588911d-kube-api-access-zvc8m\") pod \"75e7b399-bd4e-44b1-8c75-f0d81588911d\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.184731 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-session\") pod \"75e7b399-bd4e-44b1-8c75-f0d81588911d\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.184969 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-serving-cert\") pod \"75e7b399-bd4e-44b1-8c75-f0d81588911d\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.185525 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-router-certs\") pod \"75e7b399-bd4e-44b1-8c75-f0d81588911d\" (UID: \"75e7b399-bd4e-44b1-8c75-f0d81588911d\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.186571 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.186599 4749 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/75e7b399-bd4e-44b1-8c75-f0d81588911d-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.186617 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.186619 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.186628 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.191009 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "75e7b399-bd4e-44b1-8c75-f0d81588911d" (UID: "75e7b399-bd4e-44b1-8c75-f0d81588911d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.191916 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e7b399-bd4e-44b1-8c75-f0d81588911d-kube-api-access-zvc8m" (OuterVolumeSpecName: "kube-api-access-zvc8m") pod "75e7b399-bd4e-44b1-8c75-f0d81588911d" (UID: "75e7b399-bd4e-44b1-8c75-f0d81588911d"). InnerVolumeSpecName "kube-api-access-zvc8m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.195506 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "75e7b399-bd4e-44b1-8c75-f0d81588911d" (UID: "75e7b399-bd4e-44b1-8c75-f0d81588911d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.200589 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "75e7b399-bd4e-44b1-8c75-f0d81588911d" (UID: "75e7b399-bd4e-44b1-8c75-f0d81588911d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.202279 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "75e7b399-bd4e-44b1-8c75-f0d81588911d" (UID: "75e7b399-bd4e-44b1-8c75-f0d81588911d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.226707 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "75e7b399-bd4e-44b1-8c75-f0d81588911d" (UID: "75e7b399-bd4e-44b1-8c75-f0d81588911d"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.227159 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "75e7b399-bd4e-44b1-8c75-f0d81588911d" (UID: "75e7b399-bd4e-44b1-8c75-f0d81588911d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.227535 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "75e7b399-bd4e-44b1-8c75-f0d81588911d" (UID: "75e7b399-bd4e-44b1-8c75-f0d81588911d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.227855 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "75e7b399-bd4e-44b1-8c75-f0d81588911d" (UID: "75e7b399-bd4e-44b1-8c75-f0d81588911d"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.227958 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"]
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.229109 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "75e7b399-bd4e-44b1-8c75-f0d81588911d" (UID: "75e7b399-bd4e-44b1-8c75-f0d81588911d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.231033 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.290931 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-client-ca\") pod \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.291354 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7bfc\" (UniqueName: \"kubernetes.io/projected/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-kube-api-access-j7bfc\") pod \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.291609 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-config\") pod \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.291778 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-serving-cert\") pod \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\" (UID: \"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be\") "
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.291988 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-client-ca" (OuterVolumeSpecName: "client-ca") pod "ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be" (UID: "ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.292166 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-audit-dir\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.292274 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.292408 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-router-certs\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.292524 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-service-ca\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.292640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.292762 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.292826 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-config" (OuterVolumeSpecName: "config") pod "ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be" (UID: "ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.292942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.293072 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-user-template-error\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.293177 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8m92\" (UniqueName: \"kubernetes.io/projected/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-kube-api-access-f8m92\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.293320 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.293449 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-audit-policies\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.293566 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.293693 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-session\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.293807 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-user-template-login\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.293965 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.294059 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.294223 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-config\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.294312 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvc8m\" (UniqueName: \"kubernetes.io/projected/75e7b399-bd4e-44b1-8c75-f0d81588911d-kube-api-access-zvc8m\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.294379 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.295697 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.295772 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.295839 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.295897 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.295954 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.296016 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.296094 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/75e7b399-bd4e-44b1-8c75-f0d81588911d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.295707 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-kube-api-access-j7bfc" (OuterVolumeSpecName: "kube-api-access-j7bfc") pod "ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be" (UID: "ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be"). InnerVolumeSpecName "kube-api-access-j7bfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.301867 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be" (UID: "ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.399892 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rspns\" (UniqueName: \"kubernetes.io/projected/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-kube-api-access-rspns\") pod \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.399972 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-client-ca\") pod \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.399999 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-serving-cert\") pod \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.400032 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-proxy-ca-bundles\") pod \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.400156 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-config\") pod \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\" (UID: \"a1f37ab7-5993-4a93-848c-2bf22bd81cb1\") " Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.400399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-user-template-error\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.400437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8m92\" (UniqueName: \"kubernetes.io/projected/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-kube-api-access-f8m92\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.400473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.400511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-audit-policies\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc 
kubenswrapper[4749]: I0310 15:53:14.400537 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.400611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-session\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.400645 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-user-template-login\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.400668 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-audit-dir\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.400690 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.400721 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-router-certs\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.400745 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-service-ca\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.400774 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.401370 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-client-ca" (OuterVolumeSpecName: "client-ca") pod "a1f37ab7-5993-4a93-848c-2bf22bd81cb1" (UID: "a1f37ab7-5993-4a93-848c-2bf22bd81cb1"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.402025 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-audit-policies\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.401464 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a1f37ab7-5993-4a93-848c-2bf22bd81cb1" (UID: "a1f37ab7-5993-4a93-848c-2bf22bd81cb1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.401480 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-config" (OuterVolumeSpecName: "config") pod "a1f37ab7-5993-4a93-848c-2bf22bd81cb1" (UID: "a1f37ab7-5993-4a93-848c-2bf22bd81cb1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.401818 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-audit-dir\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.401814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.402510 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.402544 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.402847 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-service-ca\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.403100 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.403878 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7bfc\" (UniqueName: \"kubernetes.io/projected/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-kube-api-access-j7bfc\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.403910 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.403925 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.403939 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.403954 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be-serving-cert\") on node 
\"crc\" DevicePath \"\"" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.406235 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-kube-api-access-rspns" (OuterVolumeSpecName: "kube-api-access-rspns") pod "a1f37ab7-5993-4a93-848c-2bf22bd81cb1" (UID: "a1f37ab7-5993-4a93-848c-2bf22bd81cb1"). InnerVolumeSpecName "kube-api-access-rspns". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.406698 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.406747 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-user-template-login\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.406802 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.407109 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a1f37ab7-5993-4a93-848c-2bf22bd81cb1" (UID: "a1f37ab7-5993-4a93-848c-2bf22bd81cb1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.407593 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-user-template-error\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.407696 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.408898 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-session\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.409702 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-system-router-certs\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " 
pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.411844 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.421344 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8m92\" (UniqueName: \"kubernetes.io/projected/bd13c040-8dc7-4134-8aaa-7c84f164ca0d-kube-api-access-f8m92\") pod \"oauth-openshift-bd8ffd566-xh9xh\" (UID: \"bd13c040-8dc7-4134-8aaa-7c84f164ca0d\") " pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.455480 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.505602 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rspns\" (UniqueName: \"kubernetes.io/projected/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-kube-api-access-rspns\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.505660 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1f37ab7-5993-4a93-848c-2bf22bd81cb1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:14 crc kubenswrapper[4749]: I0310 15:53:14.900631 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bd8ffd566-xh9xh"] Mar 10 15:53:14 crc kubenswrapper[4749]: W0310 15:53:14.942058 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd13c040_8dc7_4134_8aaa_7c84f164ca0d.slice/crio-0ea0d69f13a25d79e0ed7ecfe78f3ea86f9e7fbd26cbf1ed5bee69b5f885a7cf WatchSource:0}: Error finding container 0ea0d69f13a25d79e0ed7ecfe78f3ea86f9e7fbd26cbf1ed5bee69b5f885a7cf: Status 404 returned error can't find the container with id 0ea0d69f13a25d79e0ed7ecfe78f3ea86f9e7fbd26cbf1ed5bee69b5f885a7cf Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.089984 4749 generic.go:334] "Generic (PLEG): container finished" podID="c56f09b3-981c-4a01-8ea3-4417c239cea6" containerID="ac65ee09df92788d62309aae04df24da0c3b7c350427a7e5216ca2c15421d355" exitCode=0 Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.090097 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v6nf" event={"ID":"c56f09b3-981c-4a01-8ea3-4417c239cea6","Type":"ContainerDied","Data":"ac65ee09df92788d62309aae04df24da0c3b7c350427a7e5216ca2c15421d355"} Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.093516 4749 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.093720 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8f5f46b85-qlp4l" event={"ID":"a1f37ab7-5993-4a93-848c-2bf22bd81cb1","Type":"ContainerDied","Data":"f4c2f3fd41680055d3639fd0b3d066e2b93e20dbab5349cba94dc36a4938d848"} Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.093805 4749 scope.go:117] "RemoveContainer" containerID="1d176850f9f3de7dd40b6daa19a31defe8b22459d0371b9742f1718f9af5924e" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.101418 4749 generic.go:334] "Generic (PLEG): container finished" podID="a19adf76-af03-4d7f-8661-4d93c67fda2e" containerID="e61340a77749010fde4d7b1b845714b9993675ba75f8bb4749d9c77d5ca1b7ac" exitCode=0 Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.101499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btwnr" event={"ID":"a19adf76-af03-4d7f-8661-4d93c67fda2e","Type":"ContainerDied","Data":"e61340a77749010fde4d7b1b845714b9993675ba75f8bb4749d9c77d5ca1b7ac"} Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.105050 4749 generic.go:334] "Generic (PLEG): container finished" podID="fea7768f-4827-4630-9169-8b44719ad779" containerID="a1f1dbb53913431fa45ad2b611f6ddc746262d2610464e9882e2d1d8f7ee9de2" exitCode=0 Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.105111 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9zb9" event={"ID":"fea7768f-4827-4630-9169-8b44719ad779","Type":"ContainerDied","Data":"a1f1dbb53913431fa45ad2b611f6ddc746262d2610464e9882e2d1d8f7ee9de2"} Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.109505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" 
event={"ID":"75e7b399-bd4e-44b1-8c75-f0d81588911d","Type":"ContainerDied","Data":"34c8c8d435533f2dd600f3199b88ae1ef7665d2165907ca2caf74366aa3837fe"} Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.109625 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5dvzk" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.122685 4749 generic.go:334] "Generic (PLEG): container finished" podID="c08511ac-9832-428c-be08-de0771ee5254" containerID="dc04157356b58916cc79bd9f71b43abe27a69fc6eabca519d0226abce170be29" exitCode=0 Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.122821 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rg2tg" event={"ID":"c08511ac-9832-428c-be08-de0771ee5254","Type":"ContainerDied","Data":"dc04157356b58916cc79bd9f71b43abe27a69fc6eabca519d0226abce170be29"} Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.127349 4749 generic.go:334] "Generic (PLEG): container finished" podID="ab8ea474-0133-461f-8498-033785eb0a52" containerID="08b403353a5ed69cbf2ca20bb9b89f9a64918081ed03a2917209192dd6b9fb9e" exitCode=0 Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.127460 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgb8q" event={"ID":"ab8ea474-0133-461f-8498-033785eb0a52","Type":"ContainerDied","Data":"08b403353a5ed69cbf2ca20bb9b89f9a64918081ed03a2917209192dd6b9fb9e"} Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.131584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" event={"ID":"ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be","Type":"ContainerDied","Data":"de5d3f088228a5521e555483dd53ea4fbb9bff13ef3dbc7fad15dd972157e326"} Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.131712 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.142728 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" event={"ID":"bd13c040-8dc7-4134-8aaa-7c84f164ca0d","Type":"ContainerStarted","Data":"0ea0d69f13a25d79e0ed7ecfe78f3ea86f9e7fbd26cbf1ed5bee69b5f885a7cf"} Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.150350 4749 generic.go:334] "Generic (PLEG): container finished" podID="f5296034-50c7-42d3-b2f4-34f1d451be99" containerID="3a01343d650fc3d5ef536514b1ee7e77aa51c7b3d3328e142d6c7e27b8ce4d74" exitCode=0 Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.150470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n829z" event={"ID":"f5296034-50c7-42d3-b2f4-34f1d451be99","Type":"ContainerDied","Data":"3a01343d650fc3d5ef536514b1ee7e77aa51c7b3d3328e142d6c7e27b8ce4d74"} Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.150514 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n829z" event={"ID":"f5296034-50c7-42d3-b2f4-34f1d451be99","Type":"ContainerDied","Data":"2ee8ab63b794ec8a1af2d972801052c8369d089f8a084c8eae2d2d03ea21f94e"} Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.150529 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ee8ab63b794ec8a1af2d972801052c8369d089f8a084c8eae2d2d03ea21f94e" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.152600 4749 generic.go:334] "Generic (PLEG): container finished" podID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" containerID="e08c68561fa1b193d25da3eebedc3905fd4b46309346b21b509b357f3de63e94" exitCode=0 Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.152688 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-99764" 
event={"ID":"b27649fa-b5c8-4aca-9de3-37f171af6e1c","Type":"ContainerDied","Data":"e08c68561fa1b193d25da3eebedc3905fd4b46309346b21b509b357f3de63e94"} Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.155954 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qnqn" event={"ID":"3484c369-2c9c-48d5-b7be-9dadf06d09ca","Type":"ContainerStarted","Data":"1df99cd0bf285de11b8c76bf66c56faf36f31fde5f215a7d78cb3969d66ca427"} Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.200046 4749 scope.go:117] "RemoveContainer" containerID="97f2533fcd74a2260f355be2d5797d8343c672bd1c2de1597940db976c086b57" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.248822 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n829z" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.273907 4749 scope.go:117] "RemoveContainer" containerID="d6435a4f5d56d6ef06bbf7476e8155aeb3664bb4427e545f6bd2838bbd7530e5" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.327733 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8f5f46b85-qlp4l"] Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.332576 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8f5f46b85-qlp4l"] Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.342661 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h"] Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.354468 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68f4499ccf-g6z8h"] Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.392553 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dvzk"] Mar 10 
15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.410116 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dvzk"] Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.437753 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5296034-50c7-42d3-b2f4-34f1d451be99-catalog-content\") pod \"f5296034-50c7-42d3-b2f4-34f1d451be99\" (UID: \"f5296034-50c7-42d3-b2f4-34f1d451be99\") " Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.437845 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5296034-50c7-42d3-b2f4-34f1d451be99-utilities\") pod \"f5296034-50c7-42d3-b2f4-34f1d451be99\" (UID: \"f5296034-50c7-42d3-b2f4-34f1d451be99\") " Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.437890 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zdj9\" (UniqueName: \"kubernetes.io/projected/f5296034-50c7-42d3-b2f4-34f1d451be99-kube-api-access-2zdj9\") pod \"f5296034-50c7-42d3-b2f4-34f1d451be99\" (UID: \"f5296034-50c7-42d3-b2f4-34f1d451be99\") " Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.448222 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5296034-50c7-42d3-b2f4-34f1d451be99-utilities" (OuterVolumeSpecName: "utilities") pod "f5296034-50c7-42d3-b2f4-34f1d451be99" (UID: "f5296034-50c7-42d3-b2f4-34f1d451be99"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.448527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5296034-50c7-42d3-b2f4-34f1d451be99-kube-api-access-2zdj9" (OuterVolumeSpecName: "kube-api-access-2zdj9") pod "f5296034-50c7-42d3-b2f4-34f1d451be99" (UID: "f5296034-50c7-42d3-b2f4-34f1d451be99"). InnerVolumeSpecName "kube-api-access-2zdj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.540811 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5296034-50c7-42d3-b2f4-34f1d451be99-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.540896 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zdj9\" (UniqueName: \"kubernetes.io/projected/f5296034-50c7-42d3-b2f4-34f1d451be99-kube-api-access-2zdj9\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.600935 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5296034-50c7-42d3-b2f4-34f1d451be99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5296034-50c7-42d3-b2f4-34f1d451be99" (UID: "f5296034-50c7-42d3-b2f4-34f1d451be99"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.619067 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e7b399-bd4e-44b1-8c75-f0d81588911d" path="/var/lib/kubelet/pods/75e7b399-bd4e-44b1-8c75-f0d81588911d/volumes" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.619984 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1f37ab7-5993-4a93-848c-2bf22bd81cb1" path="/var/lib/kubelet/pods/a1f37ab7-5993-4a93-848c-2bf22bd81cb1/volumes" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.620647 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be" path="/var/lib/kubelet/pods/ff0e4ac1-6c65-49f2-ad39-b1aeadf5a0be/volumes" Mar 10 15:53:15 crc kubenswrapper[4749]: I0310 15:53:15.644982 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5296034-50c7-42d3-b2f4-34f1d451be99-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:15 crc kubenswrapper[4749]: E0310 15:53:15.705271 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5296034_50c7_42d3_b2f4_34f1d451be99.slice\": RecentStats: unable to find data in memory cache]" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.156237 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s"] Mar 10 15:53:16 crc kubenswrapper[4749]: E0310 15:53:16.156927 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5296034-50c7-42d3-b2f4-34f1d451be99" containerName="registry-server" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.156944 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5296034-50c7-42d3-b2f4-34f1d451be99" 
containerName="registry-server" Mar 10 15:53:16 crc kubenswrapper[4749]: E0310 15:53:16.156965 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5296034-50c7-42d3-b2f4-34f1d451be99" containerName="extract-content" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.157004 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5296034-50c7-42d3-b2f4-34f1d451be99" containerName="extract-content" Mar 10 15:53:16 crc kubenswrapper[4749]: E0310 15:53:16.157017 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5296034-50c7-42d3-b2f4-34f1d451be99" containerName="extract-utilities" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.157025 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5296034-50c7-42d3-b2f4-34f1d451be99" containerName="extract-utilities" Mar 10 15:53:16 crc kubenswrapper[4749]: E0310 15:53:16.157043 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f37ab7-5993-4a93-848c-2bf22bd81cb1" containerName="controller-manager" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.157051 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f37ab7-5993-4a93-848c-2bf22bd81cb1" containerName="controller-manager" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.157195 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1f37ab7-5993-4a93-848c-2bf22bd81cb1" containerName="controller-manager" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.157222 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5296034-50c7-42d3-b2f4-34f1d451be99" containerName="registry-server" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.157788 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.161053 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.161050 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.161640 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.161679 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.165477 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.167436 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.177129 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh"] Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.178069 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.182182 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgb8q" event={"ID":"ab8ea474-0133-461f-8498-033785eb0a52","Type":"ContainerStarted","Data":"383984e9327e528abb84335dffa117856f2d8744163d8aa8673511fe131b664a"} Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.185243 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.195610 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.195862 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.195878 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.196005 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.196541 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.197850 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.207788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v6nf" 
event={"ID":"c56f09b3-981c-4a01-8ea3-4417c239cea6","Type":"ContainerStarted","Data":"7ac856826147df8a42ebc68641d70cdf778712158eab8bdb694ea0f384acae90"} Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.216292 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh"] Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.228538 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" event={"ID":"bd13c040-8dc7-4134-8aaa-7c84f164ca0d","Type":"ContainerStarted","Data":"4f71e466749a2aa14b3953b1411ffa32ea716cb5b80238736380e0b65962a614"} Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.228644 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s"] Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.228682 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.237995 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.239837 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-99764" event={"ID":"b27649fa-b5c8-4aca-9de3-37f171af6e1c","Type":"ContainerStarted","Data":"6661a727a21d46b8a5a1510b46ed6493785e78b69ecb93945cd3d062fd3c2658"} Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.252946 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b65x\" (UniqueName: \"kubernetes.io/projected/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-kube-api-access-6b65x\") pod \"route-controller-manager-5b44c8698d-xfk2s\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " 
pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.253070 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-serving-cert\") pod \"route-controller-manager-5b44c8698d-xfk2s\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.253155 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-config\") pod \"route-controller-manager-5b44c8698d-xfk2s\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.253227 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-client-ca\") pod \"route-controller-manager-5b44c8698d-xfk2s\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.265826 4749 generic.go:334] "Generic (PLEG): container finished" podID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" containerID="1df99cd0bf285de11b8c76bf66c56faf36f31fde5f215a7d78cb3969d66ca427" exitCode=0 Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.266050 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qnqn" 
event={"ID":"3484c369-2c9c-48d5-b7be-9dadf06d09ca","Type":"ContainerDied","Data":"1df99cd0bf285de11b8c76bf66c56faf36f31fde5f215a7d78cb3969d66ca427"} Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.279003 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dgb8q" podStartSLOduration=2.791139203 podStartE2EDuration="1m16.278979562s" podCreationTimestamp="2026-03-10 15:52:00 +0000 UTC" firstStartedPulling="2026-03-10 15:52:02.160756912 +0000 UTC m=+219.282622609" lastFinishedPulling="2026-03-10 15:53:15.648597281 +0000 UTC m=+292.770462968" observedRunningTime="2026-03-10 15:53:16.27682037 +0000 UTC m=+293.398686067" watchObservedRunningTime="2026-03-10 15:53:16.278979562 +0000 UTC m=+293.400845249" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.286793 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rg2tg" event={"ID":"c08511ac-9832-428c-be08-de0771ee5254","Type":"ContainerStarted","Data":"78ef571a6fca3e9288fe5770ce3e8751cae9a99e9e717c14b2253381d91777f1"} Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.305409 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2v6nf" podStartSLOduration=3.617799175 podStartE2EDuration="1m17.305364448s" podCreationTimestamp="2026-03-10 15:51:59 +0000 UTC" firstStartedPulling="2026-03-10 15:52:02.114810304 +0000 UTC m=+219.236675991" lastFinishedPulling="2026-03-10 15:53:15.802375577 +0000 UTC m=+292.924241264" observedRunningTime="2026-03-10 15:53:16.302892487 +0000 UTC m=+293.424758174" watchObservedRunningTime="2026-03-10 15:53:16.305364448 +0000 UTC m=+293.427230135" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.330838 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btwnr" 
event={"ID":"a19adf76-af03-4d7f-8661-4d93c67fda2e","Type":"ContainerStarted","Data":"346a95741f6eb6b00b842d6247b7872a73bf9f15bbd8e3175bb62ceba9162149"} Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.333758 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-bd8ffd566-xh9xh" podStartSLOduration=28.33373486 podStartE2EDuration="28.33373486s" podCreationTimestamp="2026-03-10 15:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:53:16.324064064 +0000 UTC m=+293.445929781" watchObservedRunningTime="2026-03-10 15:53:16.33373486 +0000 UTC m=+293.455600547" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.338358 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n829z" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.338673 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9zb9" event={"ID":"fea7768f-4827-4630-9169-8b44719ad779","Type":"ContainerStarted","Data":"b994cb03a82d4cfa66f289ce148006f3adcba94dc9e16fe082b58d16293c0d1c"} Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.356192 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-client-ca\") pod \"route-controller-manager-5b44c8698d-xfk2s\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.356248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-client-ca\") pod \"controller-manager-687ddc9bfb-mpbhh\" 
(UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.356334 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b65x\" (UniqueName: \"kubernetes.io/projected/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-kube-api-access-6b65x\") pod \"route-controller-manager-5b44c8698d-xfk2s\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.356421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-serving-cert\") pod \"route-controller-manager-5b44c8698d-xfk2s\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.356454 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sqr9\" (UniqueName: \"kubernetes.io/projected/4216a52c-d779-462e-bfe5-4bf84bcd5684-kube-api-access-9sqr9\") pod \"controller-manager-687ddc9bfb-mpbhh\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.356484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-proxy-ca-bundles\") pod \"controller-manager-687ddc9bfb-mpbhh\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.356552 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4216a52c-d779-462e-bfe5-4bf84bcd5684-serving-cert\") pod \"controller-manager-687ddc9bfb-mpbhh\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.356572 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-config\") pod \"controller-manager-687ddc9bfb-mpbhh\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.356608 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-config\") pod \"route-controller-manager-5b44c8698d-xfk2s\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.368923 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-client-ca\") pod \"route-controller-manager-5b44c8698d-xfk2s\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.369259 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-config\") pod \"route-controller-manager-5b44c8698d-xfk2s\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " 
pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.390318 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b65x\" (UniqueName: \"kubernetes.io/projected/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-kube-api-access-6b65x\") pod \"route-controller-manager-5b44c8698d-xfk2s\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.404273 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-serving-cert\") pod \"route-controller-manager-5b44c8698d-xfk2s\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.449554 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-99764" podStartSLOduration=4.791413964 podStartE2EDuration="1m18.449529799s" podCreationTimestamp="2026-03-10 15:51:58 +0000 UTC" firstStartedPulling="2026-03-10 15:52:02.184827884 +0000 UTC m=+219.306693571" lastFinishedPulling="2026-03-10 15:53:15.842943719 +0000 UTC m=+292.964809406" observedRunningTime="2026-03-10 15:53:16.404705724 +0000 UTC m=+293.526571421" watchObservedRunningTime="2026-03-10 15:53:16.449529799 +0000 UTC m=+293.571395486" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.451432 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h9zb9" podStartSLOduration=4.959819103 podStartE2EDuration="1m18.451421182s" podCreationTimestamp="2026-03-10 15:51:58 +0000 UTC" firstStartedPulling="2026-03-10 15:52:02.088474268 +0000 UTC m=+219.210339955" 
lastFinishedPulling="2026-03-10 15:53:15.580076347 +0000 UTC m=+292.701942034" observedRunningTime="2026-03-10 15:53:16.441015805 +0000 UTC m=+293.562881492" watchObservedRunningTime="2026-03-10 15:53:16.451421182 +0000 UTC m=+293.573286869" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.461463 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sqr9\" (UniqueName: \"kubernetes.io/projected/4216a52c-d779-462e-bfe5-4bf84bcd5684-kube-api-access-9sqr9\") pod \"controller-manager-687ddc9bfb-mpbhh\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.461531 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-proxy-ca-bundles\") pod \"controller-manager-687ddc9bfb-mpbhh\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.461580 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4216a52c-d779-462e-bfe5-4bf84bcd5684-serving-cert\") pod \"controller-manager-687ddc9bfb-mpbhh\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.461609 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-config\") pod \"controller-manager-687ddc9bfb-mpbhh\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.461678 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-client-ca\") pod \"controller-manager-687ddc9bfb-mpbhh\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.463671 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-config\") pod \"controller-manager-687ddc9bfb-mpbhh\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.464297 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-client-ca\") pod \"controller-manager-687ddc9bfb-mpbhh\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.465548 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-proxy-ca-bundles\") pod \"controller-manager-687ddc9bfb-mpbhh\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.470018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4216a52c-d779-462e-bfe5-4bf84bcd5684-serving-cert\") pod \"controller-manager-687ddc9bfb-mpbhh\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc 
kubenswrapper[4749]: I0310 15:53:16.477745 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n829z"] Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.481844 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n829z"] Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.482817 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.504317 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sqr9\" (UniqueName: \"kubernetes.io/projected/4216a52c-d779-462e-bfe5-4bf84bcd5684-kube-api-access-9sqr9\") pod \"controller-manager-687ddc9bfb-mpbhh\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.511778 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.512172 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-btwnr" podStartSLOduration=4.683665809 podStartE2EDuration="1m18.512141533s" podCreationTimestamp="2026-03-10 15:51:58 +0000 UTC" firstStartedPulling="2026-03-10 15:52:02.027512569 +0000 UTC m=+219.149378256" lastFinishedPulling="2026-03-10 15:53:15.855988293 +0000 UTC m=+292.977853980" observedRunningTime="2026-03-10 15:53:16.51134436 +0000 UTC m=+293.633210057" watchObservedRunningTime="2026-03-10 15:53:16.512141533 +0000 UTC m=+293.634007220" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.549613 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rg2tg" podStartSLOduration=3.082547092 podStartE2EDuration="1m16.549590115s" podCreationTimestamp="2026-03-10 15:52:00 +0000 UTC" firstStartedPulling="2026-03-10 15:52:02.139942766 +0000 UTC m=+219.261808443" lastFinishedPulling="2026-03-10 15:53:15.606985779 +0000 UTC m=+292.728851466" observedRunningTime="2026-03-10 15:53:16.54798679 +0000 UTC m=+293.669852487" watchObservedRunningTime="2026-03-10 15:53:16.549590115 +0000 UTC m=+293.671455802" Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.884975 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh"] Mar 10 15:53:16 crc kubenswrapper[4749]: I0310 15:53:16.988183 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s"] Mar 10 15:53:17 crc kubenswrapper[4749]: I0310 15:53:17.350192 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" 
event={"ID":"1df12ec3-a4bf-4816-9493-b4d61a6e48c7","Type":"ContainerStarted","Data":"94597103cc61638790fdb3b4dff05144eaeb383c148e883ab6f42db0af361a27"} Mar 10 15:53:17 crc kubenswrapper[4749]: I0310 15:53:17.350780 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" event={"ID":"1df12ec3-a4bf-4816-9493-b4d61a6e48c7","Type":"ContainerStarted","Data":"5b2b10c39ed9e550b05204fa35adf2c0cb2eb858a76356df7d5518214404f130"} Mar 10 15:53:17 crc kubenswrapper[4749]: I0310 15:53:17.351385 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:17 crc kubenswrapper[4749]: I0310 15:53:17.353895 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" event={"ID":"4216a52c-d779-462e-bfe5-4bf84bcd5684","Type":"ContainerStarted","Data":"d2cae0879fb9d1a8d7cad43c5fcfe923083ac82eec2de5fd174b4e9e5b165605"} Mar 10 15:53:17 crc kubenswrapper[4749]: I0310 15:53:17.353950 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" event={"ID":"4216a52c-d779-462e-bfe5-4bf84bcd5684","Type":"ContainerStarted","Data":"be4ffb2aebd88864b410690d7e172d7af93e9b931f22989429c774d7d7aa3482"} Mar 10 15:53:17 crc kubenswrapper[4749]: I0310 15:53:17.354358 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:17 crc kubenswrapper[4749]: I0310 15:53:17.368124 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:53:17 crc kubenswrapper[4749]: I0310 15:53:17.404260 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" podStartSLOduration=6.404230982 podStartE2EDuration="6.404230982s" podCreationTimestamp="2026-03-10 15:53:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:53:17.402347449 +0000 UTC m=+294.524213146" watchObservedRunningTime="2026-03-10 15:53:17.404230982 +0000 UTC m=+294.526096679" Mar 10 15:53:17 crc kubenswrapper[4749]: I0310 15:53:17.405019 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" podStartSLOduration=6.405008595 podStartE2EDuration="6.405008595s" podCreationTimestamp="2026-03-10 15:53:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:53:17.379011919 +0000 UTC m=+294.500877606" watchObservedRunningTime="2026-03-10 15:53:17.405008595 +0000 UTC m=+294.526874282" Mar 10 15:53:17 crc kubenswrapper[4749]: I0310 15:53:17.622773 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5296034-50c7-42d3-b2f4-34f1d451be99" path="/var/lib/kubelet/pods/f5296034-50c7-42d3-b2f4-34f1d451be99/volumes" Mar 10 15:53:17 crc kubenswrapper[4749]: I0310 15:53:17.781499 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:53:18 crc kubenswrapper[4749]: I0310 15:53:18.364759 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qnqn" event={"ID":"3484c369-2c9c-48d5-b7be-9dadf06d09ca","Type":"ContainerStarted","Data":"ff560fa5eda1c0f64bbe79d1ccb3f7ac2c8481d3c5837920f56bce2af60a39d5"} Mar 10 15:53:18 crc kubenswrapper[4749]: I0310 15:53:18.817271 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:53:18 crc kubenswrapper[4749]: I0310 15:53:18.817342 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:53:18 crc kubenswrapper[4749]: I0310 15:53:18.871901 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:53:18 crc kubenswrapper[4749]: I0310 15:53:18.900955 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9qnqn" podStartSLOduration=3.478073636 podStartE2EDuration="1m17.900934935s" podCreationTimestamp="2026-03-10 15:52:01 +0000 UTC" firstStartedPulling="2026-03-10 15:52:03.261835879 +0000 UTC m=+220.383701566" lastFinishedPulling="2026-03-10 15:53:17.684697178 +0000 UTC m=+294.806562865" observedRunningTime="2026-03-10 15:53:18.398481629 +0000 UTC m=+295.520347326" watchObservedRunningTime="2026-03-10 15:53:18.900934935 +0000 UTC m=+296.022800622" Mar 10 15:53:18 crc kubenswrapper[4749]: I0310 15:53:18.989101 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:53:18 crc kubenswrapper[4749]: I0310 15:53:18.989187 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:53:19 crc kubenswrapper[4749]: I0310 15:53:19.200866 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-99764" Mar 10 15:53:19 crc kubenswrapper[4749]: I0310 15:53:19.201034 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-99764" Mar 10 15:53:19 crc kubenswrapper[4749]: I0310 15:53:19.402077 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:53:19 crc kubenswrapper[4749]: I0310 15:53:19.402159 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:53:19 crc kubenswrapper[4749]: I0310 15:53:19.444026 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:53:20 crc kubenswrapper[4749]: I0310 15:53:20.046463 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-btwnr" podUID="a19adf76-af03-4d7f-8661-4d93c67fda2e" containerName="registry-server" probeResult="failure" output=< Mar 10 15:53:20 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 10 15:53:20 crc kubenswrapper[4749]: > Mar 10 15:53:20 crc kubenswrapper[4749]: I0310 15:53:20.251592 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-99764" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" containerName="registry-server" probeResult="failure" output=< Mar 10 15:53:20 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 10 15:53:20 crc kubenswrapper[4749]: > Mar 10 15:53:20 crc kubenswrapper[4749]: I0310 15:53:20.835645 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:53:20 crc kubenswrapper[4749]: I0310 15:53:20.837207 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:53:20 crc kubenswrapper[4749]: I0310 15:53:20.889707 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:53:20 crc kubenswrapper[4749]: I0310 15:53:20.980748 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:53:20 crc kubenswrapper[4749]: I0310 15:53:20.980848 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:53:20 crc kubenswrapper[4749]: I0310 15:53:20.980951 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 15:53:20 crc kubenswrapper[4749]: I0310 15:53:20.982024 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:53:20 crc kubenswrapper[4749]: I0310 15:53:20.982111 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7" gracePeriod=600 Mar 10 15:53:21 crc kubenswrapper[4749]: I0310 15:53:21.208510 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:53:21 crc kubenswrapper[4749]: I0310 15:53:21.208615 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:53:21 crc kubenswrapper[4749]: I0310 15:53:21.258058 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:53:21 crc kubenswrapper[4749]: I0310 15:53:21.441994 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:53:21 crc kubenswrapper[4749]: I0310 15:53:21.442304 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:53:22 crc kubenswrapper[4749]: I0310 15:53:22.291297 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:53:22 crc kubenswrapper[4749]: I0310 15:53:22.291419 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:53:22 crc kubenswrapper[4749]: I0310 15:53:22.408974 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7" exitCode=0 Mar 10 15:53:22 crc kubenswrapper[4749]: I0310 15:53:22.409079 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7"} Mar 10 15:53:23 crc kubenswrapper[4749]: I0310 15:53:23.336718 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9qnqn" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" containerName="registry-server" probeResult="failure" output=< Mar 10 15:53:23 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 10 15:53:23 crc 
kubenswrapper[4749]: > Mar 10 15:53:23 crc kubenswrapper[4749]: I0310 15:53:23.419165 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"235eec5455f54320042cab9b4bf8ef066c9980bb92c37290b73e4a493f648064"} Mar 10 15:53:24 crc kubenswrapper[4749]: I0310 15:53:24.315921 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgb8q"] Mar 10 15:53:24 crc kubenswrapper[4749]: I0310 15:53:24.316229 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dgb8q" podUID="ab8ea474-0133-461f-8498-033785eb0a52" containerName="registry-server" containerID="cri-o://383984e9327e528abb84335dffa117856f2d8744163d8aa8673511fe131b664a" gracePeriod=2 Mar 10 15:53:24 crc kubenswrapper[4749]: I0310 15:53:24.817289 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:53:24 crc kubenswrapper[4749]: I0310 15:53:24.905497 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqf9j\" (UniqueName: \"kubernetes.io/projected/ab8ea474-0133-461f-8498-033785eb0a52-kube-api-access-fqf9j\") pod \"ab8ea474-0133-461f-8498-033785eb0a52\" (UID: \"ab8ea474-0133-461f-8498-033785eb0a52\") " Mar 10 15:53:24 crc kubenswrapper[4749]: I0310 15:53:24.905607 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab8ea474-0133-461f-8498-033785eb0a52-catalog-content\") pod \"ab8ea474-0133-461f-8498-033785eb0a52\" (UID: \"ab8ea474-0133-461f-8498-033785eb0a52\") " Mar 10 15:53:24 crc kubenswrapper[4749]: I0310 15:53:24.905728 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab8ea474-0133-461f-8498-033785eb0a52-utilities\") pod \"ab8ea474-0133-461f-8498-033785eb0a52\" (UID: \"ab8ea474-0133-461f-8498-033785eb0a52\") " Mar 10 15:53:24 crc kubenswrapper[4749]: I0310 15:53:24.906730 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab8ea474-0133-461f-8498-033785eb0a52-utilities" (OuterVolumeSpecName: "utilities") pod "ab8ea474-0133-461f-8498-033785eb0a52" (UID: "ab8ea474-0133-461f-8498-033785eb0a52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:53:24 crc kubenswrapper[4749]: I0310 15:53:24.912599 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab8ea474-0133-461f-8498-033785eb0a52-kube-api-access-fqf9j" (OuterVolumeSpecName: "kube-api-access-fqf9j") pod "ab8ea474-0133-461f-8498-033785eb0a52" (UID: "ab8ea474-0133-461f-8498-033785eb0a52"). InnerVolumeSpecName "kube-api-access-fqf9j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:53:24 crc kubenswrapper[4749]: I0310 15:53:24.943783 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab8ea474-0133-461f-8498-033785eb0a52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab8ea474-0133-461f-8498-033785eb0a52" (UID: "ab8ea474-0133-461f-8498-033785eb0a52"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.007587 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqf9j\" (UniqueName: \"kubernetes.io/projected/ab8ea474-0133-461f-8498-033785eb0a52-kube-api-access-fqf9j\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.007634 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab8ea474-0133-461f-8498-033785eb0a52-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.007644 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab8ea474-0133-461f-8498-033785eb0a52-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.450156 4749 generic.go:334] "Generic (PLEG): container finished" podID="ab8ea474-0133-461f-8498-033785eb0a52" containerID="383984e9327e528abb84335dffa117856f2d8744163d8aa8673511fe131b664a" exitCode=0 Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.450267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgb8q" event={"ID":"ab8ea474-0133-461f-8498-033785eb0a52","Type":"ContainerDied","Data":"383984e9327e528abb84335dffa117856f2d8744163d8aa8673511fe131b664a"} Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.450311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-dgb8q" event={"ID":"ab8ea474-0133-461f-8498-033785eb0a52","Type":"ContainerDied","Data":"4206c37d121978a1471d5e590538fd97cd2b5119aad10997e31b4f46cf32e9bd"} Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.450367 4749 scope.go:117] "RemoveContainer" containerID="383984e9327e528abb84335dffa117856f2d8744163d8aa8673511fe131b664a" Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.450487 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgb8q" Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.480868 4749 scope.go:117] "RemoveContainer" containerID="08b403353a5ed69cbf2ca20bb9b89f9a64918081ed03a2917209192dd6b9fb9e" Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.485473 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgb8q"] Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.492789 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgb8q"] Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.500924 4749 scope.go:117] "RemoveContainer" containerID="25045fc1e098586fa59f0cc33c8dcc72045574dd47f57631ed3a547de9dfadc1" Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.522557 4749 scope.go:117] "RemoveContainer" containerID="383984e9327e528abb84335dffa117856f2d8744163d8aa8673511fe131b664a" Mar 10 15:53:25 crc kubenswrapper[4749]: E0310 15:53:25.523171 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383984e9327e528abb84335dffa117856f2d8744163d8aa8673511fe131b664a\": container with ID starting with 383984e9327e528abb84335dffa117856f2d8744163d8aa8673511fe131b664a not found: ID does not exist" containerID="383984e9327e528abb84335dffa117856f2d8744163d8aa8673511fe131b664a" Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.523221 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383984e9327e528abb84335dffa117856f2d8744163d8aa8673511fe131b664a"} err="failed to get container status \"383984e9327e528abb84335dffa117856f2d8744163d8aa8673511fe131b664a\": rpc error: code = NotFound desc = could not find container \"383984e9327e528abb84335dffa117856f2d8744163d8aa8673511fe131b664a\": container with ID starting with 383984e9327e528abb84335dffa117856f2d8744163d8aa8673511fe131b664a not found: ID does not exist" Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.523257 4749 scope.go:117] "RemoveContainer" containerID="08b403353a5ed69cbf2ca20bb9b89f9a64918081ed03a2917209192dd6b9fb9e" Mar 10 15:53:25 crc kubenswrapper[4749]: E0310 15:53:25.523729 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08b403353a5ed69cbf2ca20bb9b89f9a64918081ed03a2917209192dd6b9fb9e\": container with ID starting with 08b403353a5ed69cbf2ca20bb9b89f9a64918081ed03a2917209192dd6b9fb9e not found: ID does not exist" containerID="08b403353a5ed69cbf2ca20bb9b89f9a64918081ed03a2917209192dd6b9fb9e" Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.523768 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08b403353a5ed69cbf2ca20bb9b89f9a64918081ed03a2917209192dd6b9fb9e"} err="failed to get container status \"08b403353a5ed69cbf2ca20bb9b89f9a64918081ed03a2917209192dd6b9fb9e\": rpc error: code = NotFound desc = could not find container \"08b403353a5ed69cbf2ca20bb9b89f9a64918081ed03a2917209192dd6b9fb9e\": container with ID starting with 08b403353a5ed69cbf2ca20bb9b89f9a64918081ed03a2917209192dd6b9fb9e not found: ID does not exist" Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.523795 4749 scope.go:117] "RemoveContainer" containerID="25045fc1e098586fa59f0cc33c8dcc72045574dd47f57631ed3a547de9dfadc1" Mar 10 15:53:25 crc kubenswrapper[4749]: E0310 
15:53:25.524215 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25045fc1e098586fa59f0cc33c8dcc72045574dd47f57631ed3a547de9dfadc1\": container with ID starting with 25045fc1e098586fa59f0cc33c8dcc72045574dd47f57631ed3a547de9dfadc1 not found: ID does not exist" containerID="25045fc1e098586fa59f0cc33c8dcc72045574dd47f57631ed3a547de9dfadc1" Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.524247 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25045fc1e098586fa59f0cc33c8dcc72045574dd47f57631ed3a547de9dfadc1"} err="failed to get container status \"25045fc1e098586fa59f0cc33c8dcc72045574dd47f57631ed3a547de9dfadc1\": rpc error: code = NotFound desc = could not find container \"25045fc1e098586fa59f0cc33c8dcc72045574dd47f57631ed3a547de9dfadc1\": container with ID starting with 25045fc1e098586fa59f0cc33c8dcc72045574dd47f57631ed3a547de9dfadc1 not found: ID does not exist" Mar 10 15:53:25 crc kubenswrapper[4749]: I0310 15:53:25.616485 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab8ea474-0133-461f-8498-033785eb0a52" path="/var/lib/kubelet/pods/ab8ea474-0133-461f-8498-033785eb0a52/volumes" Mar 10 15:53:28 crc kubenswrapper[4749]: I0310 15:53:28.864051 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:53:29 crc kubenswrapper[4749]: I0310 15:53:29.039214 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:53:29 crc kubenswrapper[4749]: I0310 15:53:29.107359 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:53:29 crc kubenswrapper[4749]: I0310 15:53:29.239544 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-99764" Mar 10 15:53:29 crc kubenswrapper[4749]: I0310 15:53:29.283837 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-99764" Mar 10 15:53:29 crc kubenswrapper[4749]: I0310 15:53:29.450213 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:53:29 crc kubenswrapper[4749]: I0310 15:53:29.918585 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-99764"] Mar 10 15:53:30 crc kubenswrapper[4749]: I0310 15:53:30.486993 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-99764" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" containerName="registry-server" containerID="cri-o://6661a727a21d46b8a5a1510b46ed6493785e78b69ecb93945cd3d062fd3c2658" gracePeriod=2 Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.087035 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.088071 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab8ea474-0133-461f-8498-033785eb0a52" containerName="extract-utilities" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.088184 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab8ea474-0133-461f-8498-033785eb0a52" containerName="extract-utilities" Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.088288 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab8ea474-0133-461f-8498-033785eb0a52" containerName="registry-server" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.088406 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab8ea474-0133-461f-8498-033785eb0a52" containerName="registry-server" Mar 10 15:53:31 crc 
kubenswrapper[4749]: E0310 15:53:31.088698 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab8ea474-0133-461f-8498-033785eb0a52" containerName="extract-content" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.088788 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab8ea474-0133-461f-8498-033785eb0a52" containerName="extract-content" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.089057 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab8ea474-0133-461f-8498-033785eb0a52" containerName="registry-server" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.089795 4749 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.089937 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.089993 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.090298 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.090402 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.090499 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.090688 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.090785 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902" gracePeriod=15 Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.090710 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa" gracePeriod=15 Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.090743 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" 
containerID="cri-o://046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e" gracePeriod=15 Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.090761 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2" gracePeriod=15 Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.090786 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.091181 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.090678 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5" gracePeriod=15 Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.091265 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.091276 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.091292 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.091317 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.091334 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.091342 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.091355 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.091363 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.091430 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.091439 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.091731 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.091743 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.091753 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: 
I0310 15:53:31.091760 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.091768 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.091777 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.091785 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.091794 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.091905 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.091913 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.092013 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.092141 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.092157 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.100668 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.133851 4749 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.199:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.200905 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.200965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.200999 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.201026 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.201111 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.201158 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.201179 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.201210 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 
15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.303678 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.303733 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.303756 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.303783 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.303812 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.303826 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.303842 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.303858 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.303861 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.304139 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.303875 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.303898 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.303884 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.304214 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.304332 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.304531 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.434862 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.482969 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.199:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b85d11cfa36c7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:53:31.481241287 +0000 UTC m=+308.603106974,LastTimestamp:2026-03-10 15:53:31.481241287 +0000 UTC m=+308.603106974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.508864 4749 generic.go:334] "Generic (PLEG): container finished" podID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" containerID="00f3345507ae95828c376056f82d3f59525a74ff0bbcedc9332b197bd4d91b13" exitCode=0 Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.508959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df","Type":"ContainerDied","Data":"00f3345507ae95828c376056f82d3f59525a74ff0bbcedc9332b197bd4d91b13"} Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.510259 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.512007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"91f95740e4b2eadf1ac0a89a253dd4f515307d2d87f3560730ef1762a5415def"} Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.515471 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.517628 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.518530 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2" exitCode=0 Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.518554 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa" exitCode=0 Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.518562 4749 generic.go:334] "Generic (PLEG): 
container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e" exitCode=0 Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.518572 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902" exitCode=2 Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.518680 4749 scope.go:117] "RemoveContainer" containerID="17580c954cad2d30ff3a77cdd5cc2c90e54296ed2e2e02cea81cdef6f29afc8b" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.524934 4749 generic.go:334] "Generic (PLEG): container finished" podID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" containerID="6661a727a21d46b8a5a1510b46ed6493785e78b69ecb93945cd3d062fd3c2658" exitCode=0 Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.524988 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-99764" event={"ID":"b27649fa-b5c8-4aca-9de3-37f171af6e1c","Type":"ContainerDied","Data":"6661a727a21d46b8a5a1510b46ed6493785e78b69ecb93945cd3d062fd3c2658"} Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.627492 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-99764" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.628591 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.629536 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.711002 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27649fa-b5c8-4aca-9de3-37f171af6e1c-utilities\") pod \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\" (UID: \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\") " Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.711780 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27649fa-b5c8-4aca-9de3-37f171af6e1c-catalog-content\") pod \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\" (UID: \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\") " Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.711899 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdhgk\" (UniqueName: \"kubernetes.io/projected/b27649fa-b5c8-4aca-9de3-37f171af6e1c-kube-api-access-jdhgk\") pod \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\" (UID: \"b27649fa-b5c8-4aca-9de3-37f171af6e1c\") " Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.713417 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b27649fa-b5c8-4aca-9de3-37f171af6e1c-utilities" (OuterVolumeSpecName: "utilities") pod "b27649fa-b5c8-4aca-9de3-37f171af6e1c" (UID: "b27649fa-b5c8-4aca-9de3-37f171af6e1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.716715 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b27649fa-b5c8-4aca-9de3-37f171af6e1c-kube-api-access-jdhgk" (OuterVolumeSpecName: "kube-api-access-jdhgk") pod "b27649fa-b5c8-4aca-9de3-37f171af6e1c" (UID: "b27649fa-b5c8-4aca-9de3-37f171af6e1c"). InnerVolumeSpecName "kube-api-access-jdhgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.777024 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b27649fa-b5c8-4aca-9de3-37f171af6e1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b27649fa-b5c8-4aca-9de3-37f171af6e1c" (UID: "b27649fa-b5c8-4aca-9de3-37f171af6e1c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.813280 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdhgk\" (UniqueName: \"kubernetes.io/projected/b27649fa-b5c8-4aca-9de3-37f171af6e1c-kube-api-access-jdhgk\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.813847 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27649fa-b5c8-4aca-9de3-37f171af6e1c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:31 crc kubenswrapper[4749]: I0310 15:53:31.813953 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27649fa-b5c8-4aca-9de3-37f171af6e1c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:31 crc kubenswrapper[4749]: E0310 15:53:31.882948 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.199:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b85d11cfa36c7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:53:31.481241287 +0000 UTC m=+308.603106974,LastTimestamp:2026-03-10 15:53:31.481241287 +0000 UTC m=+308.603106974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.338227 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.339646 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.340073 4749 status_manager.go:851] "Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" pod="openshift-marketplace/redhat-operators-9qnqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.340251 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.393002 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9qnqn" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.393919 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: 
connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.394599 4749 status_manager.go:851] "Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" pod="openshift-marketplace/redhat-operators-9qnqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.394940 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.535918 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.540484 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-99764" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.540475 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-99764" event={"ID":"b27649fa-b5c8-4aca-9de3-37f171af6e1c","Type":"ContainerDied","Data":"752582a86847c76962e6a06d3fbe063f9b1027144cbb97171c14f9bdac8b7de0"} Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.540632 4749 scope.go:117] "RemoveContainer" containerID="6661a727a21d46b8a5a1510b46ed6493785e78b69ecb93945cd3d062fd3c2658" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.541533 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.541868 4749 status_manager.go:851] "Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" pod="openshift-marketplace/redhat-operators-9qnqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.542362 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.542807 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6fd6d0237c30e52996abaae1fa287b1d25e394c64352c708ddcbc48313df0e8b"} Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.543695 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.543892 4749 status_manager.go:851] "Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" pod="openshift-marketplace/redhat-operators-9qnqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: E0310 15:53:32.543891 4749 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.199:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.544147 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.558815 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.559911 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.560645 4749 status_manager.go:851] "Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" pod="openshift-marketplace/redhat-operators-9qnqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.561292 4749 scope.go:117] "RemoveContainer" containerID="e08c68561fa1b193d25da3eebedc3905fd4b46309346b21b509b357f3de63e94" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.599355 4749 scope.go:117] "RemoveContainer" containerID="0fd1cc7dab6111c3999b0d6c0bf61799066e237bf84fef7310ae2043dece1183" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.881202 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.882332 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.882772 4749 status_manager.go:851] "Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" pod="openshift-marketplace/redhat-operators-9qnqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:32 crc kubenswrapper[4749]: I0310 15:53:32.883289 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.031991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-kube-api-access\") pod \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\" (UID: \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\") " Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.032126 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-kubelet-dir\") pod \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\" (UID: \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\") " Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 
15:53:33.032177 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-var-lock\") pod \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\" (UID: \"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df\") " Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.032319 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" (UID: "e8fcdb6a-3cda-4f8a-8353-63e0a7e833df"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.032366 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-var-lock" (OuterVolumeSpecName: "var-lock") pod "e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" (UID: "e8fcdb6a-3cda-4f8a-8353-63e0a7e833df"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.032741 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.032768 4749 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.038180 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" (UID: "e8fcdb6a-3cda-4f8a-8353-63e0a7e833df"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.137495 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8fcdb6a-3cda-4f8a-8353-63e0a7e833df-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.453915 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.455119 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.455900 4749 status_manager.go:851] "Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" pod="openshift-marketplace/redhat-operators-9qnqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.456127 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.456519 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.457089 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.543818 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 15:53:33 crc 
kubenswrapper[4749]: I0310 15:53:33.543880 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.543924 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.543993 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.544045 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.544128 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.544271 4749 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.544284 4749 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.544292 4749 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.551366 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.551424 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e8fcdb6a-3cda-4f8a-8353-63e0a7e833df","Type":"ContainerDied","Data":"dce4195d0c301a33f8ffbbe07338d3532a73a341831b0da9a0a3d49288ee519d"} Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.551476 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dce4195d0c301a33f8ffbbe07338d3532a73a341831b0da9a0a3d49288ee519d" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.555172 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.556575 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5" exitCode=0 Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.556767 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.556812 4749 scope.go:117] "RemoveContainer" containerID="ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2" Mar 10 15:53:33 crc kubenswrapper[4749]: E0310 15:53:33.557496 4749 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.199:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.565619 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.565935 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.566401 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc 
kubenswrapper[4749]: I0310 15:53:33.566665 4749 status_manager.go:851] "Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" pod="openshift-marketplace/redhat-operators-9qnqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.573160 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.573961 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.574414 4749 scope.go:117] "RemoveContainer" containerID="388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.574644 4749 status_manager.go:851] "Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" pod="openshift-marketplace/redhat-operators-9qnqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.574873 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.594735 4749 scope.go:117] "RemoveContainer" containerID="046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.610141 4749 status_manager.go:851] "Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" pod="openshift-marketplace/redhat-operators-9qnqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.610583 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.611120 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.611505 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.611862 4749 scope.go:117] 
"RemoveContainer" containerID="44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.618119 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.630630 4749 scope.go:117] "RemoveContainer" containerID="63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.648865 4749 scope.go:117] "RemoveContainer" containerID="07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.670979 4749 scope.go:117] "RemoveContainer" containerID="ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2" Mar 10 15:53:33 crc kubenswrapper[4749]: E0310 15:53:33.671354 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2\": container with ID starting with ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2 not found: ID does not exist" containerID="ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.671449 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2"} err="failed to get container status \"ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2\": rpc error: code = NotFound desc = could not find container \"ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2\": container with ID starting with ea7ef8e498d6d86bc4a493ba61e5480181c52759e95e45ccc467c2c948945ef2 not found: ID does not exist" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.671483 
4749 scope.go:117] "RemoveContainer" containerID="388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa" Mar 10 15:53:33 crc kubenswrapper[4749]: E0310 15:53:33.672029 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\": container with ID starting with 388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa not found: ID does not exist" containerID="388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.672165 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa"} err="failed to get container status \"388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\": rpc error: code = NotFound desc = could not find container \"388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa\": container with ID starting with 388dfe019ad56997dad37198190612fd7fbd9c0e78516a87964ad78f13517eaa not found: ID does not exist" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.672284 4749 scope.go:117] "RemoveContainer" containerID="046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e" Mar 10 15:53:33 crc kubenswrapper[4749]: E0310 15:53:33.672769 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\": container with ID starting with 046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e not found: ID does not exist" containerID="046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.672804 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e"} err="failed to get container status \"046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\": rpc error: code = NotFound desc = could not find container \"046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e\": container with ID starting with 046727c106438d94691209476f56cdcff99f78ae3631abadad6cd3b0a57a4f9e not found: ID does not exist" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.672826 4749 scope.go:117] "RemoveContainer" containerID="44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902" Mar 10 15:53:33 crc kubenswrapper[4749]: E0310 15:53:33.673091 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\": container with ID starting with 44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902 not found: ID does not exist" containerID="44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.673191 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902"} err="failed to get container status \"44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\": rpc error: code = NotFound desc = could not find container \"44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902\": container with ID starting with 44de13f6a4cc66b2a28eb21055e266e9446ed262da56ba62d6fecf1eb00ef902 not found: ID does not exist" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.673219 4749 scope.go:117] "RemoveContainer" containerID="63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5" Mar 10 15:53:33 crc kubenswrapper[4749]: E0310 15:53:33.675752 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\": container with ID starting with 63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5 not found: ID does not exist" containerID="63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.675875 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5"} err="failed to get container status \"63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\": rpc error: code = NotFound desc = could not find container \"63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5\": container with ID starting with 63f3ae04d7314780bc2d0a00629b2c8df98bff1d2e740b7ebc2444f1bfd1a2d5 not found: ID does not exist" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.675965 4749 scope.go:117] "RemoveContainer" containerID="07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777" Mar 10 15:53:33 crc kubenswrapper[4749]: E0310 15:53:33.676674 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\": container with ID starting with 07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777 not found: ID does not exist" containerID="07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777" Mar 10 15:53:33 crc kubenswrapper[4749]: I0310 15:53:33.676737 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777"} err="failed to get container status \"07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\": rpc error: code = NotFound desc = could not find container 
\"07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777\": container with ID starting with 07a55fe9deeb6cf3405c40d8ef6c234a08e37c8ecab5713525fdfd8d1653a777 not found: ID does not exist" Mar 10 15:53:39 crc kubenswrapper[4749]: E0310 15:53:39.448143 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:39 crc kubenswrapper[4749]: E0310 15:53:39.449166 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:39 crc kubenswrapper[4749]: E0310 15:53:39.449712 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:39 crc kubenswrapper[4749]: E0310 15:53:39.450094 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:39 crc kubenswrapper[4749]: E0310 15:53:39.450446 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:39 crc kubenswrapper[4749]: I0310 15:53:39.450487 4749 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 10 15:53:39 crc kubenswrapper[4749]: E0310 15:53:39.450790 4749 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="200ms" Mar 10 15:53:39 crc kubenswrapper[4749]: E0310 15:53:39.651531 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="400ms" Mar 10 15:53:40 crc kubenswrapper[4749]: E0310 15:53:40.052817 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="800ms" Mar 10 15:53:40 crc kubenswrapper[4749]: E0310 15:53:40.266793 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:53:40Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:53:40Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:53:40Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:53:40Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:40 crc kubenswrapper[4749]: E0310 15:53:40.267656 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:40 crc kubenswrapper[4749]: E0310 15:53:40.268210 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:40 crc kubenswrapper[4749]: E0310 15:53:40.268899 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:40 crc kubenswrapper[4749]: E0310 15:53:40.269692 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:40 crc kubenswrapper[4749]: E0310 15:53:40.269743 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:53:40 crc kubenswrapper[4749]: E0310 15:53:40.853765 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="1.6s" Mar 10 15:53:41 crc kubenswrapper[4749]: E0310 15:53:41.884518 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.199:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b85d11cfa36c7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:53:31.481241287 +0000 UTC m=+308.603106974,LastTimestamp:2026-03-10 15:53:31.481241287 +0000 UTC m=+308.603106974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:53:42 crc kubenswrapper[4749]: E0310 15:53:42.454633 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="3.2s" Mar 10 15:53:43 crc kubenswrapper[4749]: I0310 15:53:43.609616 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:43 crc kubenswrapper[4749]: I0310 15:53:43.610273 4749 status_manager.go:851] "Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" 
pod="openshift-marketplace/redhat-operators-9qnqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:43 crc kubenswrapper[4749]: I0310 15:53:43.610771 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:43 crc kubenswrapper[4749]: I0310 15:53:43.631963 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 15:53:43 crc kubenswrapper[4749]: I0310 15:53:43.632588 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 15:53:43 crc kubenswrapper[4749]: I0310 15:53:43.632647 4749 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108" exitCode=1 Mar 10 15:53:43 crc kubenswrapper[4749]: I0310 15:53:43.632756 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108"} Mar 10 15:53:43 crc kubenswrapper[4749]: I0310 15:53:43.633481 4749 scope.go:117] "RemoveContainer" containerID="ecec5e9cde8a55f05fd3d4dfdde78e1cd028666fdf5daafd7f4748dd1859f108" Mar 10 15:53:43 crc kubenswrapper[4749]: I0310 15:53:43.633918 4749 status_manager.go:851] "Failed to get 
status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:43 crc kubenswrapper[4749]: I0310 15:53:43.634464 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:43 crc kubenswrapper[4749]: I0310 15:53:43.634756 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:43 crc kubenswrapper[4749]: I0310 15:53:43.635028 4749 status_manager.go:851] "Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" pod="openshift-marketplace/redhat-operators-9qnqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:43 crc kubenswrapper[4749]: I0310 15:53:43.663977 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:53:44 crc kubenswrapper[4749]: I0310 15:53:44.644697 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 15:53:44 crc kubenswrapper[4749]: I0310 
15:53:44.647826 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 15:53:44 crc kubenswrapper[4749]: I0310 15:53:44.647933 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"724da2d3166c931f5af34c32c45e7cd4cdf447896ad947f0ed737a8d694d720d"} Mar 10 15:53:44 crc kubenswrapper[4749]: I0310 15:53:44.649370 4749 status_manager.go:851] "Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" pod="openshift-marketplace/redhat-operators-9qnqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:44 crc kubenswrapper[4749]: I0310 15:53:44.650207 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:44 crc kubenswrapper[4749]: I0310 15:53:44.650877 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:44 crc kubenswrapper[4749]: I0310 15:53:44.651121 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:45 crc kubenswrapper[4749]: I0310 15:53:45.536908 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:53:45 crc kubenswrapper[4749]: E0310 15:53:45.655658 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="6.4s" Mar 10 15:53:46 crc kubenswrapper[4749]: I0310 15:53:46.605902 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:46 crc kubenswrapper[4749]: I0310 15:53:46.607142 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:46 crc kubenswrapper[4749]: I0310 15:53:46.607870 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:46 crc kubenswrapper[4749]: I0310 15:53:46.608356 4749 status_manager.go:851] "Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" pod="openshift-marketplace/redhat-operators-9qnqn" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:46 crc kubenswrapper[4749]: I0310 15:53:46.608828 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:46 crc kubenswrapper[4749]: I0310 15:53:46.623232 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2c4083e1-ef1a-4d75-9c21-20e180f6a5e6" Mar 10 15:53:46 crc kubenswrapper[4749]: I0310 15:53:46.623282 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2c4083e1-ef1a-4d75-9c21-20e180f6a5e6" Mar 10 15:53:46 crc kubenswrapper[4749]: E0310 15:53:46.624725 4749 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:46 crc kubenswrapper[4749]: I0310 15:53:46.625445 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:46 crc kubenswrapper[4749]: W0310 15:53:46.650196 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-598a9a802b88b4b925cc788ee51751728dddf23c585d1981563315c6865171d8 WatchSource:0}: Error finding container 598a9a802b88b4b925cc788ee51751728dddf23c585d1981563315c6865171d8: Status 404 returned error can't find the container with id 598a9a802b88b4b925cc788ee51751728dddf23c585d1981563315c6865171d8 Mar 10 15:53:46 crc kubenswrapper[4749]: I0310 15:53:46.662410 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"598a9a802b88b4b925cc788ee51751728dddf23c585d1981563315c6865171d8"} Mar 10 15:53:47 crc kubenswrapper[4749]: I0310 15:53:47.671398 4749 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="0ab11d1121a749de2db5921841e81ca51e01e0e8373658a65ff0b902c9bdc529" exitCode=0 Mar 10 15:53:47 crc kubenswrapper[4749]: I0310 15:53:47.671463 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"0ab11d1121a749de2db5921841e81ca51e01e0e8373658a65ff0b902c9bdc529"} Mar 10 15:53:47 crc kubenswrapper[4749]: I0310 15:53:47.671862 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2c4083e1-ef1a-4d75-9c21-20e180f6a5e6" Mar 10 15:53:47 crc kubenswrapper[4749]: I0310 15:53:47.671897 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2c4083e1-ef1a-4d75-9c21-20e180f6a5e6" Mar 10 15:53:47 crc kubenswrapper[4749]: I0310 15:53:47.672343 4749 status_manager.go:851] 
"Failed to get status for pod" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" pod="openshift-marketplace/redhat-operators-9qnqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-9qnqn\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:47 crc kubenswrapper[4749]: E0310 15:53:47.672501 4749 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:47 crc kubenswrapper[4749]: I0310 15:53:47.672655 4749 status_manager.go:851] "Failed to get status for pod" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" pod="openshift-marketplace/certified-operators-99764" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-99764\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:47 crc kubenswrapper[4749]: I0310 15:53:47.672948 4749 status_manager.go:851] "Failed to get status for pod" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:47 crc kubenswrapper[4749]: I0310 15:53:47.673223 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Mar 10 15:53:47 crc kubenswrapper[4749]: E0310 15:53:47.687503 4749 desired_state_of_world_populator.go:312] "Error processing volume" err="error 
processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.199:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" volumeName="registry-storage" Mar 10 15:53:48 crc kubenswrapper[4749]: I0310 15:53:48.684538 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2bb216a059d62bb89a1e64437601a8fae9f0ed590631eaf4f7fd64feaea7485f"} Mar 10 15:53:48 crc kubenswrapper[4749]: I0310 15:53:48.684933 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"11d8a4706f763aa783bf57e522670b8ea534b5095898d5f49b2b231b5a3d857b"} Mar 10 15:53:48 crc kubenswrapper[4749]: I0310 15:53:48.684945 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6861f5338a4da69ffd21e03750006a8f872f8d23b0d53504b860c33d15fbd1d0"} Mar 10 15:53:49 crc kubenswrapper[4749]: I0310 15:53:49.695597 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d2e4bbf68a91933bc997fe7a58d722ca9bf6e88f6917afe9cba9e2eb46f97bc6"} Mar 10 15:53:49 crc kubenswrapper[4749]: I0310 15:53:49.696332 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"13c0a2d73d06c9e78134b30a21815280eeacb2c916937a7cb8b0d5a6a5f0b865"} Mar 10 15:53:49 crc 
kubenswrapper[4749]: I0310 15:53:49.696354 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:49 crc kubenswrapper[4749]: I0310 15:53:49.696010 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2c4083e1-ef1a-4d75-9c21-20e180f6a5e6" Mar 10 15:53:49 crc kubenswrapper[4749]: I0310 15:53:49.696418 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2c4083e1-ef1a-4d75-9c21-20e180f6a5e6" Mar 10 15:53:51 crc kubenswrapper[4749]: I0310 15:53:51.626013 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:51 crc kubenswrapper[4749]: I0310 15:53:51.626062 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:51 crc kubenswrapper[4749]: I0310 15:53:51.632731 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:53 crc kubenswrapper[4749]: I0310 15:53:53.663561 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:53:53 crc kubenswrapper[4749]: I0310 15:53:53.668524 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:53:53 crc kubenswrapper[4749]: I0310 15:53:53.726985 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:53:54 crc kubenswrapper[4749]: I0310 15:53:54.712754 4749 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:54 crc kubenswrapper[4749]: I0310 15:53:54.805190 4749 
status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="0e7d01b5-fe6b-4a6a-8f2b-a3878ad2e496" Mar 10 15:53:55 crc kubenswrapper[4749]: I0310 15:53:55.736611 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2c4083e1-ef1a-4d75-9c21-20e180f6a5e6" Mar 10 15:53:55 crc kubenswrapper[4749]: I0310 15:53:55.736659 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2c4083e1-ef1a-4d75-9c21-20e180f6a5e6" Mar 10 15:53:55 crc kubenswrapper[4749]: I0310 15:53:55.740789 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:53:55 crc kubenswrapper[4749]: I0310 15:53:55.741113 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="0e7d01b5-fe6b-4a6a-8f2b-a3878ad2e496" Mar 10 15:53:56 crc kubenswrapper[4749]: I0310 15:53:56.745270 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2c4083e1-ef1a-4d75-9c21-20e180f6a5e6" Mar 10 15:53:56 crc kubenswrapper[4749]: I0310 15:53:56.746179 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="2c4083e1-ef1a-4d75-9c21-20e180f6a5e6" Mar 10 15:53:56 crc kubenswrapper[4749]: I0310 15:53:56.749261 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="0e7d01b5-fe6b-4a6a-8f2b-a3878ad2e496" Mar 10 15:54:04 crc kubenswrapper[4749]: I0310 15:54:04.104286 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 15:54:04 crc kubenswrapper[4749]: I0310 15:54:04.588768 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 15:54:05 crc kubenswrapper[4749]: I0310 15:54:05.242179 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 15:54:05 crc kubenswrapper[4749]: I0310 15:54:05.448860 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 15:54:05 crc kubenswrapper[4749]: I0310 15:54:05.494961 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 10 15:54:05 crc kubenswrapper[4749]: I0310 15:54:05.608850 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 15:54:05 crc kubenswrapper[4749]: I0310 15:54:05.687582 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 15:54:05 crc kubenswrapper[4749]: I0310 15:54:05.714241 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 15:54:05 crc kubenswrapper[4749]: I0310 15:54:05.774199 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 15:54:05 crc kubenswrapper[4749]: I0310 15:54:05.868822 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 15:54:06 crc kubenswrapper[4749]: I0310 15:54:06.535632 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 15:54:06 crc 
kubenswrapper[4749]: I0310 15:54:06.700212 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.033209 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.214408 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.229950 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.238689 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.454101 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.537688 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.544920 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.625665 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.673816 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.675431 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.687468 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.691015 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.980491 4749 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.982166 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.982972 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.986565 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/certified-operators-99764"] Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.986647 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.992990 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 15:54:07 crc kubenswrapper[4749]: I0310 15:54:07.994478 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.009241 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.00921321 podStartE2EDuration="14.00921321s" 
podCreationTimestamp="2026-03-10 15:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:54:08.004252339 +0000 UTC m=+345.126118036" watchObservedRunningTime="2026-03-10 15:54:08.00921321 +0000 UTC m=+345.131078897" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.011248 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.013267 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.072339 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.104661 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.117204 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.178244 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.209140 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.399665 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.432117 4749 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 15:54:08 crc kubenswrapper[4749]: 
I0310 15:54:08.497137 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.514521 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.532699 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.533052 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.541864 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.598066 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.686548 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.696778 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.825567 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.839601 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.874064 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 15:54:08 crc 
kubenswrapper[4749]: I0310 15:54:08.932435 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 15:54:08 crc kubenswrapper[4749]: I0310 15:54:08.997090 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 15:54:09 crc kubenswrapper[4749]: I0310 15:54:09.007887 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 15:54:09 crc kubenswrapper[4749]: I0310 15:54:09.134598 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 15:54:09 crc kubenswrapper[4749]: I0310 15:54:09.286654 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 15:54:09 crc kubenswrapper[4749]: I0310 15:54:09.305906 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 15:54:09 crc kubenswrapper[4749]: I0310 15:54:09.575903 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 10 15:54:09 crc kubenswrapper[4749]: I0310 15:54:09.597593 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 15:54:09 crc kubenswrapper[4749]: I0310 15:54:09.615588 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" path="/var/lib/kubelet/pods/b27649fa-b5c8-4aca-9de3-37f171af6e1c/volumes" Mar 10 15:54:09 crc kubenswrapper[4749]: I0310 15:54:09.669623 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 15:54:09 crc kubenswrapper[4749]: I0310 15:54:09.720993 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 15:54:09 crc kubenswrapper[4749]: I0310 15:54:09.778926 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 10 15:54:09 crc kubenswrapper[4749]: I0310 15:54:09.846737 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 15:54:09 crc kubenswrapper[4749]: I0310 15:54:09.911698 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:09.979679 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:10.079740 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:10.080159 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:10.123966 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:10.166897 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:10.170707 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 15:54:10 crc 
kubenswrapper[4749]: I0310 15:54:10.207778 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:10.225607 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:10.362240 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:10.453600 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:10.470481 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:10.532212 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:10.647326 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:10.662664 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:10.708918 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 15:54:10 crc kubenswrapper[4749]: I0310 15:54:10.749306 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.058593 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 15:54:11 
crc kubenswrapper[4749]: I0310 15:54:11.076076 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.106346 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.193913 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.221737 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.372235 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.382717 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.413491 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.448195 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.454936 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.571521 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.595417 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.652367 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.673079 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.687671 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.716301 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.770080 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.959691 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.965821 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.972319 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 15:54:11 crc kubenswrapper[4749]: I0310 15:54:11.991367 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.005846 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 
15:54:12.041157 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.053261 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.088856 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.094354 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.112140 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.173778 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.251688 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.276602 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s"] Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.276869 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" podUID="1df12ec3-a4bf-4816-9493-b4d61a6e48c7" containerName="route-controller-manager" containerID="cri-o://94597103cc61638790fdb3b4dff05144eaeb383c148e883ab6f42db0af361a27" gracePeriod=30 Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.309492 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"dns-default" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.340754 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.343407 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.365356 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh"] Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.365742 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" podUID="4216a52c-d779-462e-bfe5-4bf84bcd5684" containerName="controller-manager" containerID="cri-o://d2cae0879fb9d1a8d7cad43c5fcfe923083ac82eec2de5fd174b4e9e5b165605" gracePeriod=30 Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.401610 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.414720 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.424574 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552634-dv6cc"] Mar 10 15:54:12 crc kubenswrapper[4749]: E0310 15:54:12.424853 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" containerName="registry-server" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.424869 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" containerName="registry-server" 
Mar 10 15:54:12 crc kubenswrapper[4749]: E0310 15:54:12.424883 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" containerName="extract-utilities" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.424891 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" containerName="extract-utilities" Mar 10 15:54:12 crc kubenswrapper[4749]: E0310 15:54:12.424903 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" containerName="extract-content" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.424909 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" containerName="extract-content" Mar 10 15:54:12 crc kubenswrapper[4749]: E0310 15:54:12.424920 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" containerName="installer" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.424928 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" containerName="installer" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.425047 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b27649fa-b5c8-4aca-9de3-37f171af6e1c" containerName="registry-server" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.425060 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8fcdb6a-3cda-4f8a-8353-63e0a7e833df" containerName="installer" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.425511 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552634-dv6cc" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.431819 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.432112 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.432268 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.470331 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljzm4\" (UniqueName: \"kubernetes.io/projected/59539c85-68ad-4b62-8484-ccda9def3258-kube-api-access-ljzm4\") pod \"auto-csr-approver-29552634-dv6cc\" (UID: \"59539c85-68ad-4b62-8484-ccda9def3258\") " pod="openshift-infra/auto-csr-approver-29552634-dv6cc" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.572429 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljzm4\" (UniqueName: \"kubernetes.io/projected/59539c85-68ad-4b62-8484-ccda9def3258-kube-api-access-ljzm4\") pod \"auto-csr-approver-29552634-dv6cc\" (UID: \"59539c85-68ad-4b62-8484-ccda9def3258\") " pod="openshift-infra/auto-csr-approver-29552634-dv6cc" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.603413 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljzm4\" (UniqueName: \"kubernetes.io/projected/59539c85-68ad-4b62-8484-ccda9def3258-kube-api-access-ljzm4\") pod \"auto-csr-approver-29552634-dv6cc\" (UID: \"59539c85-68ad-4b62-8484-ccda9def3258\") " pod="openshift-infra/auto-csr-approver-29552634-dv6cc" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.708761 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.751755 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.758631 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.767250 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.814164 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552634-dv6cc" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.835089 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.855141 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.856242 4749 generic.go:334] "Generic (PLEG): container finished" podID="4216a52c-d779-462e-bfe5-4bf84bcd5684" containerID="d2cae0879fb9d1a8d7cad43c5fcfe923083ac82eec2de5fd174b4e9e5b165605" exitCode=0 Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.856342 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" event={"ID":"4216a52c-d779-462e-bfe5-4bf84bcd5684","Type":"ContainerDied","Data":"d2cae0879fb9d1a8d7cad43c5fcfe923083ac82eec2de5fd174b4e9e5b165605"} Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.856394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" event={"ID":"4216a52c-d779-462e-bfe5-4bf84bcd5684","Type":"ContainerDied","Data":"be4ffb2aebd88864b410690d7e172d7af93e9b931f22989429c774d7d7aa3482"} Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.856420 4749 scope.go:117] "RemoveContainer" containerID="d2cae0879fb9d1a8d7cad43c5fcfe923083ac82eec2de5fd174b4e9e5b165605" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.882959 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-client-ca\") pod \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.883014 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-client-ca\") pod \"4216a52c-d779-462e-bfe5-4bf84bcd5684\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.883041 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-proxy-ca-bundles\") pod \"4216a52c-d779-462e-bfe5-4bf84bcd5684\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.883104 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-config\") pod \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.883135 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-serving-cert\") pod \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.883180 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4216a52c-d779-462e-bfe5-4bf84bcd5684-serving-cert\") pod \"4216a52c-d779-462e-bfe5-4bf84bcd5684\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.883228 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sqr9\" (UniqueName: \"kubernetes.io/projected/4216a52c-d779-462e-bfe5-4bf84bcd5684-kube-api-access-9sqr9\") pod \"4216a52c-d779-462e-bfe5-4bf84bcd5684\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") " Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.883274 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-config\") pod \"4216a52c-d779-462e-bfe5-4bf84bcd5684\" (UID: \"4216a52c-d779-462e-bfe5-4bf84bcd5684\") 
" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.883308 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b65x\" (UniqueName: \"kubernetes.io/projected/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-kube-api-access-6b65x\") pod \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\" (UID: \"1df12ec3-a4bf-4816-9493-b4d61a6e48c7\") " Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.888469 4749 generic.go:334] "Generic (PLEG): container finished" podID="1df12ec3-a4bf-4816-9493-b4d61a6e48c7" containerID="94597103cc61638790fdb3b4dff05144eaeb383c148e883ab6f42db0af361a27" exitCode=0 Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.888536 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" event={"ID":"1df12ec3-a4bf-4816-9493-b4d61a6e48c7","Type":"ContainerDied","Data":"94597103cc61638790fdb3b4dff05144eaeb383c148e883ab6f42db0af361a27"} Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.888571 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" event={"ID":"1df12ec3-a4bf-4816-9493-b4d61a6e48c7","Type":"ContainerDied","Data":"5b2b10c39ed9e550b05204fa35adf2c0cb2eb858a76356df7d5518214404f130"} Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.888643 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.888971 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-config" (OuterVolumeSpecName: "config") pod "1df12ec3-a4bf-4816-9493-b4d61a6e48c7" (UID: "1df12ec3-a4bf-4816-9493-b4d61a6e48c7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.889304 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-client-ca" (OuterVolumeSpecName: "client-ca") pod "1df12ec3-a4bf-4816-9493-b4d61a6e48c7" (UID: "1df12ec3-a4bf-4816-9493-b4d61a6e48c7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.889638 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-client-ca" (OuterVolumeSpecName: "client-ca") pod "4216a52c-d779-462e-bfe5-4bf84bcd5684" (UID: "4216a52c-d779-462e-bfe5-4bf84bcd5684"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.890110 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-config" (OuterVolumeSpecName: "config") pod "4216a52c-d779-462e-bfe5-4bf84bcd5684" (UID: "4216a52c-d779-462e-bfe5-4bf84bcd5684"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.890525 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4216a52c-d779-462e-bfe5-4bf84bcd5684" (UID: "4216a52c-d779-462e-bfe5-4bf84bcd5684"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.892275 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4216a52c-d779-462e-bfe5-4bf84bcd5684-kube-api-access-9sqr9" (OuterVolumeSpecName: "kube-api-access-9sqr9") pod "4216a52c-d779-462e-bfe5-4bf84bcd5684" (UID: "4216a52c-d779-462e-bfe5-4bf84bcd5684"). InnerVolumeSpecName "kube-api-access-9sqr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.893655 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4216a52c-d779-462e-bfe5-4bf84bcd5684-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4216a52c-d779-462e-bfe5-4bf84bcd5684" (UID: "4216a52c-d779-462e-bfe5-4bf84bcd5684"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.894045 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1df12ec3-a4bf-4816-9493-b4d61a6e48c7" (UID: "1df12ec3-a4bf-4816-9493-b4d61a6e48c7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.898741 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-kube-api-access-6b65x" (OuterVolumeSpecName: "kube-api-access-6b65x") pod "1df12ec3-a4bf-4816-9493-b4d61a6e48c7" (UID: "1df12ec3-a4bf-4816-9493-b4d61a6e48c7"). InnerVolumeSpecName "kube-api-access-6b65x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.933917 4749 scope.go:117] "RemoveContainer" containerID="d2cae0879fb9d1a8d7cad43c5fcfe923083ac82eec2de5fd174b4e9e5b165605" Mar 10 15:54:12 crc kubenswrapper[4749]: E0310 15:54:12.936005 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2cae0879fb9d1a8d7cad43c5fcfe923083ac82eec2de5fd174b4e9e5b165605\": container with ID starting with d2cae0879fb9d1a8d7cad43c5fcfe923083ac82eec2de5fd174b4e9e5b165605 not found: ID does not exist" containerID="d2cae0879fb9d1a8d7cad43c5fcfe923083ac82eec2de5fd174b4e9e5b165605" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.936064 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2cae0879fb9d1a8d7cad43c5fcfe923083ac82eec2de5fd174b4e9e5b165605"} err="failed to get container status \"d2cae0879fb9d1a8d7cad43c5fcfe923083ac82eec2de5fd174b4e9e5b165605\": rpc error: code = NotFound desc = could not find container \"d2cae0879fb9d1a8d7cad43c5fcfe923083ac82eec2de5fd174b4e9e5b165605\": container with ID starting with d2cae0879fb9d1a8d7cad43c5fcfe923083ac82eec2de5fd174b4e9e5b165605 not found: ID does not exist" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.936097 4749 scope.go:117] "RemoveContainer" containerID="94597103cc61638790fdb3b4dff05144eaeb383c148e883ab6f42db0af361a27" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.956082 4749 scope.go:117] "RemoveContainer" containerID="94597103cc61638790fdb3b4dff05144eaeb383c148e883ab6f42db0af361a27" Mar 10 15:54:12 crc kubenswrapper[4749]: E0310 15:54:12.957534 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94597103cc61638790fdb3b4dff05144eaeb383c148e883ab6f42db0af361a27\": container with ID starting with 
94597103cc61638790fdb3b4dff05144eaeb383c148e883ab6f42db0af361a27 not found: ID does not exist" containerID="94597103cc61638790fdb3b4dff05144eaeb383c148e883ab6f42db0af361a27" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.957606 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94597103cc61638790fdb3b4dff05144eaeb383c148e883ab6f42db0af361a27"} err="failed to get container status \"94597103cc61638790fdb3b4dff05144eaeb383c148e883ab6f42db0af361a27\": rpc error: code = NotFound desc = could not find container \"94597103cc61638790fdb3b4dff05144eaeb383c148e883ab6f42db0af361a27\": container with ID starting with 94597103cc61638790fdb3b4dff05144eaeb383c148e883ab6f42db0af361a27 not found: ID does not exist" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.960361 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.968326 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.985523 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.985565 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.985576 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.985591 4749 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.985602 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.985612 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4216a52c-d779-462e-bfe5-4bf84bcd5684-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.985624 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sqr9\" (UniqueName: \"kubernetes.io/projected/4216a52c-d779-462e-bfe5-4bf84bcd5684-kube-api-access-9sqr9\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.985635 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4216a52c-d779-462e-bfe5-4bf84bcd5684-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:12 crc kubenswrapper[4749]: I0310 15:54:12.985645 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b65x\" (UniqueName: \"kubernetes.io/projected/1df12ec3-a4bf-4816-9493-b4d61a6e48c7-kube-api-access-6b65x\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.039700 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.072432 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.079241 4749 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.087240 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.099312 4749 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.111985 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.214269 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s"] Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.218116 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b44c8698d-xfk2s"] Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.225501 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.280099 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.352847 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.431748 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.434778 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.515795 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.613855 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df12ec3-a4bf-4816-9493-b4d61a6e48c7" path="/var/lib/kubelet/pods/1df12ec3-a4bf-4816-9493-b4d61a6e48c7/volumes" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.615625 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.655770 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.681240 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.779251 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.909699 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.919923 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.922183 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.922238 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh" Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.942617 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh"] Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.947541 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-687ddc9bfb-mpbhh"] Mar 10 15:54:13 crc kubenswrapper[4749]: I0310 15:54:13.959109 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.019480 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.117863 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.174036 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.193174 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq"] Mar 10 15:54:14 crc kubenswrapper[4749]: E0310 15:54:14.193557 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df12ec3-a4bf-4816-9493-b4d61a6e48c7" containerName="route-controller-manager" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.193586 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df12ec3-a4bf-4816-9493-b4d61a6e48c7" containerName="route-controller-manager" Mar 10 15:54:14 crc kubenswrapper[4749]: E0310 15:54:14.193611 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4216a52c-d779-462e-bfe5-4bf84bcd5684" containerName="controller-manager" Mar 10 
15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.193621 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4216a52c-d779-462e-bfe5-4bf84bcd5684" containerName="controller-manager" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.193745 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df12ec3-a4bf-4816-9493-b4d61a6e48c7" containerName="route-controller-manager" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.193772 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4216a52c-d779-462e-bfe5-4bf84bcd5684" containerName="controller-manager" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.194333 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.196869 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.197644 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.198977 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.199052 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.199108 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.199157 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 15:54:14 crc 
kubenswrapper[4749]: I0310 15:54:14.199280 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.199606 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67c8b7b957-9rrnn"] Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.200554 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.205738 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.206187 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.206683 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.207334 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.208217 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.211120 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.215948 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.224684 4749 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.249136 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.318480 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d530d0b-3ad3-416e-a600-997f7ed4f976-serving-cert\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.318554 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-config\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.318728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l49qm\" (UniqueName: \"kubernetes.io/projected/1d530d0b-3ad3-416e-a600-997f7ed4f976-kube-api-access-l49qm\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.318790 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2858d6db-ce93-467d-8eba-54579431e3ae-config\") pod \"route-controller-manager-7678dbd556-pgnbq\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") " 
pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.318829 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-client-ca\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.318950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-proxy-ca-bundles\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.318974 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2858d6db-ce93-467d-8eba-54579431e3ae-client-ca\") pod \"route-controller-manager-7678dbd556-pgnbq\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.319094 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2858d6db-ce93-467d-8eba-54579431e3ae-serving-cert\") pod \"route-controller-manager-7678dbd556-pgnbq\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.319224 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b8f5\" (UniqueName: \"kubernetes.io/projected/2858d6db-ce93-467d-8eba-54579431e3ae-kube-api-access-4b8f5\") pod \"route-controller-manager-7678dbd556-pgnbq\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.354303 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.420513 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b8f5\" (UniqueName: \"kubernetes.io/projected/2858d6db-ce93-467d-8eba-54579431e3ae-kube-api-access-4b8f5\") pod \"route-controller-manager-7678dbd556-pgnbq\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.420591 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d530d0b-3ad3-416e-a600-997f7ed4f976-serving-cert\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.420621 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-config\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.420661 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l49qm\" (UniqueName: \"kubernetes.io/projected/1d530d0b-3ad3-416e-a600-997f7ed4f976-kube-api-access-l49qm\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.420689 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2858d6db-ce93-467d-8eba-54579431e3ae-config\") pod \"route-controller-manager-7678dbd556-pgnbq\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.420712 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-client-ca\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.420767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-proxy-ca-bundles\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.420798 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2858d6db-ce93-467d-8eba-54579431e3ae-client-ca\") pod \"route-controller-manager-7678dbd556-pgnbq\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:14 
crc kubenswrapper[4749]: I0310 15:54:14.420834 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2858d6db-ce93-467d-8eba-54579431e3ae-serving-cert\") pod \"route-controller-manager-7678dbd556-pgnbq\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.422191 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2858d6db-ce93-467d-8eba-54579431e3ae-client-ca\") pod \"route-controller-manager-7678dbd556-pgnbq\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.422261 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-client-ca\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.422457 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2858d6db-ce93-467d-8eba-54579431e3ae-config\") pod \"route-controller-manager-7678dbd556-pgnbq\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.422496 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-proxy-ca-bundles\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: 
\"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.422536 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-config\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.428128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2858d6db-ce93-467d-8eba-54579431e3ae-serving-cert\") pod \"route-controller-manager-7678dbd556-pgnbq\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.429767 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d530d0b-3ad3-416e-a600-997f7ed4f976-serving-cert\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.440134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b8f5\" (UniqueName: \"kubernetes.io/projected/2858d6db-ce93-467d-8eba-54579431e3ae-kube-api-access-4b8f5\") pod \"route-controller-manager-7678dbd556-pgnbq\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.443843 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l49qm\" (UniqueName: 
\"kubernetes.io/projected/1d530d0b-3ad3-416e-a600-997f7ed4f976-kube-api-access-l49qm\") pod \"controller-manager-67c8b7b957-9rrnn\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.454181 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.505473 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.515957 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.526008 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.643207 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.758189 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.787751 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.798715 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.845791 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 15:54:14 crc kubenswrapper[4749]: 
I0310 15:54:14.866977 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.944722 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.952247 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.996614 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 15:54:14 crc kubenswrapper[4749]: I0310 15:54:14.998053 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 15:54:15 crc kubenswrapper[4749]: I0310 15:54:15.155208 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 15:54:15 crc kubenswrapper[4749]: I0310 15:54:15.232418 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 15:54:15 crc kubenswrapper[4749]: I0310 15:54:15.320673 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 15:54:15 crc kubenswrapper[4749]: I0310 15:54:15.346390 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 15:54:15 crc kubenswrapper[4749]: I0310 15:54:15.604236 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 15:54:15 crc kubenswrapper[4749]: I0310 15:54:15.614978 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4216a52c-d779-462e-bfe5-4bf84bcd5684" path="/var/lib/kubelet/pods/4216a52c-d779-462e-bfe5-4bf84bcd5684/volumes" Mar 10 15:54:15 crc kubenswrapper[4749]: I0310 15:54:15.719481 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 15:54:15 crc kubenswrapper[4749]: I0310 15:54:15.806787 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 15:54:15 crc kubenswrapper[4749]: I0310 15:54:15.925338 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.007474 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.122709 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.181239 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.185898 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.302252 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.303822 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.446256 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.487633 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.504340 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.518693 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.532954 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.632540 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.691842 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.698407 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.718481 4749 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.733410 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.733577 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 15:54:16 crc 
kubenswrapper[4749]: I0310 15:54:16.747726 4749 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.770181 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.817829 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.930398 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 15:54:16 crc kubenswrapper[4749]: I0310 15:54:16.983600 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.008920 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.053946 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.062360 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.163756 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.218624 4749 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.219309 4749 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://6fd6d0237c30e52996abaae1fa287b1d25e394c64352c708ddcbc48313df0e8b" gracePeriod=5 Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.233807 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.238750 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.430028 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.753276 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.776325 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.799525 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.822109 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.927493 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.967627 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 
15:54:17 crc kubenswrapper[4749]: I0310 15:54:17.972718 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 15:54:18 crc kubenswrapper[4749]: I0310 15:54:18.094849 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 15:54:18 crc kubenswrapper[4749]: I0310 15:54:18.124037 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 15:54:18 crc kubenswrapper[4749]: I0310 15:54:18.246948 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 15:54:18 crc kubenswrapper[4749]: I0310 15:54:18.256125 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 15:54:18 crc kubenswrapper[4749]: I0310 15:54:18.411111 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 15:54:18 crc kubenswrapper[4749]: I0310 15:54:18.607204 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 15:54:18 crc kubenswrapper[4749]: I0310 15:54:18.914230 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 15:54:18 crc kubenswrapper[4749]: I0310 15:54:18.957475 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.005559 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c8b7b957-9rrnn"] Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.010416 4749 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552634-dv6cc"] Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.016685 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq"] Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.341146 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.353356 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.451611 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq"] Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.491032 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552634-dv6cc"] Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.544758 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c8b7b957-9rrnn"] Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.551802 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 15:54:19 crc kubenswrapper[4749]: W0310 15:54:19.558289 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d530d0b_3ad3_416e_a600_997f7ed4f976.slice/crio-826a155f3585d3dd6e810ac5e1cb44a19ccf64c3f535b735768f3540aa2b91a0 WatchSource:0}: Error finding container 826a155f3585d3dd6e810ac5e1cb44a19ccf64c3f535b735768f3540aa2b91a0: Status 404 returned error can't find the container with id 826a155f3585d3dd6e810ac5e1cb44a19ccf64c3f535b735768f3540aa2b91a0 Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.656433 
4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.709354 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.797279 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.865037 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.980277 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" event={"ID":"2858d6db-ce93-467d-8eba-54579431e3ae","Type":"ContainerStarted","Data":"ba48b9ad053132f2bf6ac15e8eece271734c4f6298c9c277914a6b452cfc2957"} Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.980364 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" event={"ID":"2858d6db-ce93-467d-8eba-54579431e3ae","Type":"ContainerStarted","Data":"d6ae9eb3a99b51d833e1c56b683e373910659ef75b386df01cf99d80cfacbf0a"} Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.980978 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.983417 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552634-dv6cc" event={"ID":"59539c85-68ad-4b62-8484-ccda9def3258","Type":"ContainerStarted","Data":"0699485e458e8bae02980ce958165b4a434e7e0afd26d9c56da80fa022e90aab"} Mar 10 15:54:19 crc kubenswrapper[4749]: 
I0310 15:54:19.986463 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" event={"ID":"1d530d0b-3ad3-416e-a600-997f7ed4f976","Type":"ContainerStarted","Data":"51a6d055056ecd15eae2417e4e9d76b0d405d79b90e1b2169bc48df09966d7dc"} Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.986496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" event={"ID":"1d530d0b-3ad3-416e-a600-997f7ed4f976","Type":"ContainerStarted","Data":"826a155f3585d3dd6e810ac5e1cb44a19ccf64c3f535b735768f3540aa2b91a0"} Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.987670 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:19 crc kubenswrapper[4749]: I0310 15:54:19.997285 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 15:54:20 crc kubenswrapper[4749]: I0310 15:54:20.003288 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:20 crc kubenswrapper[4749]: I0310 15:54:20.046322 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" podStartSLOduration=8.046291547 podStartE2EDuration="8.046291547s" podCreationTimestamp="2026-03-10 15:54:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:54:20.041492102 +0000 UTC m=+357.163357799" watchObservedRunningTime="2026-03-10 15:54:20.046291547 +0000 UTC m=+357.168157234" Mar 10 15:54:20 crc kubenswrapper[4749]: I0310 15:54:20.084299 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" podStartSLOduration=8.084273356 podStartE2EDuration="8.084273356s" podCreationTimestamp="2026-03-10 15:54:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:54:20.08018407 +0000 UTC m=+357.202049777" watchObservedRunningTime="2026-03-10 15:54:20.084273356 +0000 UTC m=+357.206139043" Mar 10 15:54:20 crc kubenswrapper[4749]: I0310 15:54:20.237862 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 15:54:20 crc kubenswrapper[4749]: I0310 15:54:20.256262 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 15:54:20 crc kubenswrapper[4749]: I0310 15:54:20.349712 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 15:54:20 crc kubenswrapper[4749]: I0310 15:54:20.478332 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:20 crc kubenswrapper[4749]: I0310 15:54:20.626336 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 15:54:21 crc kubenswrapper[4749]: I0310 15:54:21.001232 4749 generic.go:334] "Generic (PLEG): container finished" podID="59539c85-68ad-4b62-8484-ccda9def3258" containerID="8595c937dc76ff9e9c7d7c657e2f4293b7ea4e946274d580b3f51966df7734a8" exitCode=0 Mar 10 15:54:21 crc kubenswrapper[4749]: I0310 15:54:21.001339 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552634-dv6cc" event={"ID":"59539c85-68ad-4b62-8484-ccda9def3258","Type":"ContainerDied","Data":"8595c937dc76ff9e9c7d7c657e2f4293b7ea4e946274d580b3f51966df7734a8"} 
Mar 10 15:54:21 crc kubenswrapper[4749]: I0310 15:54:21.073032 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.375303 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552634-dv6cc" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.391283 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.391471 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.444994 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.445124 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljzm4\" (UniqueName: \"kubernetes.io/projected/59539c85-68ad-4b62-8484-ccda9def3258-kube-api-access-ljzm4\") pod \"59539c85-68ad-4b62-8484-ccda9def3258\" (UID: \"59539c85-68ad-4b62-8484-ccda9def3258\") " Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.445156 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.445180 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.445175 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.445251 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.445303 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.445319 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.445883 4749 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.445904 4749 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.445989 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.446035 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.452606 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59539c85-68ad-4b62-8484-ccda9def3258-kube-api-access-ljzm4" (OuterVolumeSpecName: "kube-api-access-ljzm4") pod "59539c85-68ad-4b62-8484-ccda9def3258" (UID: "59539c85-68ad-4b62-8484-ccda9def3258"). InnerVolumeSpecName "kube-api-access-ljzm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.455831 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.547614 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljzm4\" (UniqueName: \"kubernetes.io/projected/59539c85-68ad-4b62-8484-ccda9def3258-kube-api-access-ljzm4\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.547656 4749 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.547666 4749 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:22 crc kubenswrapper[4749]: I0310 15:54:22.547676 4749 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:23 crc kubenswrapper[4749]: I0310 15:54:23.016177 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 15:54:23 crc kubenswrapper[4749]: I0310 15:54:23.016235 4749 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerID="6fd6d0237c30e52996abaae1fa287b1d25e394c64352c708ddcbc48313df0e8b" exitCode=137 Mar 10 15:54:23 crc kubenswrapper[4749]: I0310 15:54:23.016311 4749 scope.go:117] "RemoveContainer" containerID="6fd6d0237c30e52996abaae1fa287b1d25e394c64352c708ddcbc48313df0e8b" Mar 10 15:54:23 crc kubenswrapper[4749]: I0310 15:54:23.016493 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:54:23 crc kubenswrapper[4749]: I0310 15:54:23.020298 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552634-dv6cc" event={"ID":"59539c85-68ad-4b62-8484-ccda9def3258","Type":"ContainerDied","Data":"0699485e458e8bae02980ce958165b4a434e7e0afd26d9c56da80fa022e90aab"} Mar 10 15:54:23 crc kubenswrapper[4749]: I0310 15:54:23.020353 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0699485e458e8bae02980ce958165b4a434e7e0afd26d9c56da80fa022e90aab" Mar 10 15:54:23 crc kubenswrapper[4749]: I0310 15:54:23.020367 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552634-dv6cc" Mar 10 15:54:23 crc kubenswrapper[4749]: I0310 15:54:23.044688 4749 scope.go:117] "RemoveContainer" containerID="6fd6d0237c30e52996abaae1fa287b1d25e394c64352c708ddcbc48313df0e8b" Mar 10 15:54:23 crc kubenswrapper[4749]: E0310 15:54:23.047192 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd6d0237c30e52996abaae1fa287b1d25e394c64352c708ddcbc48313df0e8b\": container with ID starting with 6fd6d0237c30e52996abaae1fa287b1d25e394c64352c708ddcbc48313df0e8b not found: ID does not exist" containerID="6fd6d0237c30e52996abaae1fa287b1d25e394c64352c708ddcbc48313df0e8b" Mar 10 15:54:23 crc kubenswrapper[4749]: I0310 15:54:23.047304 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd6d0237c30e52996abaae1fa287b1d25e394c64352c708ddcbc48313df0e8b"} err="failed to get container status \"6fd6d0237c30e52996abaae1fa287b1d25e394c64352c708ddcbc48313df0e8b\": rpc error: code = NotFound desc = could not find container \"6fd6d0237c30e52996abaae1fa287b1d25e394c64352c708ddcbc48313df0e8b\": container with ID starting with 6fd6d0237c30e52996abaae1fa287b1d25e394c64352c708ddcbc48313df0e8b not found: ID does not exist" Mar 10 15:54:23 crc kubenswrapper[4749]: I0310 15:54:23.618758 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.264782 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9zb9"] Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.265827 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h9zb9" podUID="fea7768f-4827-4630-9169-8b44719ad779" 
containerName="registry-server" containerID="cri-o://b994cb03a82d4cfa66f289ce148006f3adcba94dc9e16fe082b58d16293c0d1c" gracePeriod=30 Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.273363 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2v6nf"] Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.274244 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2v6nf" podUID="c56f09b3-981c-4a01-8ea3-4417c239cea6" containerName="registry-server" containerID="cri-o://7ac856826147df8a42ebc68641d70cdf778712158eab8bdb694ea0f384acae90" gracePeriod=30 Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.292630 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-btwnr"] Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.293091 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-btwnr" podUID="a19adf76-af03-4d7f-8661-4d93c67fda2e" containerName="registry-server" containerID="cri-o://346a95741f6eb6b00b842d6247b7872a73bf9f15bbd8e3175bb62ceba9162149" gracePeriod=30 Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.307126 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pgr7"] Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.307458 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" podUID="4fd422ad-ee9e-46ec-ae29-a9cec3a7129b" containerName="marketplace-operator" containerID="cri-o://50cc9b1123a57004b9fdc5913f93058cd08154562aa4ff9246ecef7ce3aea5ed" gracePeriod=30 Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.317499 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rg2tg"] Mar 10 15:54:26 crc 
kubenswrapper[4749]: I0310 15:54:26.317809 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rg2tg" podUID="c08511ac-9832-428c-be08-de0771ee5254" containerName="registry-server" containerID="cri-o://78ef571a6fca3e9288fe5770ce3e8751cae9a99e9e717c14b2253381d91777f1" gracePeriod=30 Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.328947 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9qnqn"] Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.331717 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9qnqn" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" containerName="registry-server" containerID="cri-o://ff560fa5eda1c0f64bbe79d1ccb3f7ac2c8481d3c5837920f56bce2af60a39d5" gracePeriod=30 Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.362786 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2mnc7"] Mar 10 15:54:26 crc kubenswrapper[4749]: E0310 15:54:26.364177 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59539c85-68ad-4b62-8484-ccda9def3258" containerName="oc" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.364201 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="59539c85-68ad-4b62-8484-ccda9def3258" containerName="oc" Mar 10 15:54:26 crc kubenswrapper[4749]: E0310 15:54:26.364223 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.364229 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.364468 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerName="startup-monitor" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.364487 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="59539c85-68ad-4b62-8484-ccda9def3258" containerName="oc" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.376946 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.382785 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2mnc7"] Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.403104 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbwdg\" (UniqueName: \"kubernetes.io/projected/6904f9b8-adbe-426e-9021-0da77d658ad6-kube-api-access-zbwdg\") pod \"marketplace-operator-79b997595-2mnc7\" (UID: \"6904f9b8-adbe-426e-9021-0da77d658ad6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.403159 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6904f9b8-adbe-426e-9021-0da77d658ad6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2mnc7\" (UID: \"6904f9b8-adbe-426e-9021-0da77d658ad6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.403412 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6904f9b8-adbe-426e-9021-0da77d658ad6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2mnc7\" (UID: \"6904f9b8-adbe-426e-9021-0da77d658ad6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7" Mar 10 15:54:26 crc 
kubenswrapper[4749]: I0310 15:54:26.505243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbwdg\" (UniqueName: \"kubernetes.io/projected/6904f9b8-adbe-426e-9021-0da77d658ad6-kube-api-access-zbwdg\") pod \"marketplace-operator-79b997595-2mnc7\" (UID: \"6904f9b8-adbe-426e-9021-0da77d658ad6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.505464 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6904f9b8-adbe-426e-9021-0da77d658ad6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2mnc7\" (UID: \"6904f9b8-adbe-426e-9021-0da77d658ad6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.505514 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6904f9b8-adbe-426e-9021-0da77d658ad6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2mnc7\" (UID: \"6904f9b8-adbe-426e-9021-0da77d658ad6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.507039 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6904f9b8-adbe-426e-9021-0da77d658ad6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2mnc7\" (UID: \"6904f9b8-adbe-426e-9021-0da77d658ad6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.515280 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6904f9b8-adbe-426e-9021-0da77d658ad6-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-2mnc7\" (UID: \"6904f9b8-adbe-426e-9021-0da77d658ad6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.529618 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbwdg\" (UniqueName: \"kubernetes.io/projected/6904f9b8-adbe-426e-9021-0da77d658ad6-kube-api-access-zbwdg\") pod \"marketplace-operator-79b997595-2mnc7\" (UID: \"6904f9b8-adbe-426e-9021-0da77d658ad6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.772246 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.880528 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.911315 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bb42\" (UniqueName: \"kubernetes.io/projected/c56f09b3-981c-4a01-8ea3-4417c239cea6-kube-api-access-6bb42\") pod \"c56f09b3-981c-4a01-8ea3-4417c239cea6\" (UID: \"c56f09b3-981c-4a01-8ea3-4417c239cea6\") " Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.911967 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c56f09b3-981c-4a01-8ea3-4417c239cea6-catalog-content\") pod \"c56f09b3-981c-4a01-8ea3-4417c239cea6\" (UID: \"c56f09b3-981c-4a01-8ea3-4417c239cea6\") " Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.912074 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c56f09b3-981c-4a01-8ea3-4417c239cea6-utilities\") pod \"c56f09b3-981c-4a01-8ea3-4417c239cea6\" 
(UID: \"c56f09b3-981c-4a01-8ea3-4417c239cea6\") " Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.913419 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2v6nf"] Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.917218 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c56f09b3-981c-4a01-8ea3-4417c239cea6-utilities" (OuterVolumeSpecName: "utilities") pod "c56f09b3-981c-4a01-8ea3-4417c239cea6" (UID: "c56f09b3-981c-4a01-8ea3-4417c239cea6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:54:26 crc kubenswrapper[4749]: I0310 15:54:26.921302 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c56f09b3-981c-4a01-8ea3-4417c239cea6-kube-api-access-6bb42" (OuterVolumeSpecName: "kube-api-access-6bb42") pod "c56f09b3-981c-4a01-8ea3-4417c239cea6" (UID: "c56f09b3-981c-4a01-8ea3-4417c239cea6"). InnerVolumeSpecName "kube-api-access-6bb42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.014604 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bb42\" (UniqueName: \"kubernetes.io/projected/c56f09b3-981c-4a01-8ea3-4417c239cea6-kube-api-access-6bb42\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.014668 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c56f09b3-981c-4a01-8ea3-4417c239cea6-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.042984 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c56f09b3-981c-4a01-8ea3-4417c239cea6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c56f09b3-981c-4a01-8ea3-4417c239cea6" (UID: "c56f09b3-981c-4a01-8ea3-4417c239cea6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.061513 4749 generic.go:334] "Generic (PLEG): container finished" podID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" containerID="ff560fa5eda1c0f64bbe79d1ccb3f7ac2c8481d3c5837920f56bce2af60a39d5" exitCode=0 Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.061585 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qnqn" event={"ID":"3484c369-2c9c-48d5-b7be-9dadf06d09ca","Type":"ContainerDied","Data":"ff560fa5eda1c0f64bbe79d1ccb3f7ac2c8481d3c5837920f56bce2af60a39d5"} Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.070735 4749 generic.go:334] "Generic (PLEG): container finished" podID="fea7768f-4827-4630-9169-8b44719ad779" containerID="b994cb03a82d4cfa66f289ce148006f3adcba94dc9e16fe082b58d16293c0d1c" exitCode=0 Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.070841 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-h9zb9" event={"ID":"fea7768f-4827-4630-9169-8b44719ad779","Type":"ContainerDied","Data":"b994cb03a82d4cfa66f289ce148006f3adcba94dc9e16fe082b58d16293c0d1c"} Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.084815 4749 generic.go:334] "Generic (PLEG): container finished" podID="c08511ac-9832-428c-be08-de0771ee5254" containerID="78ef571a6fca3e9288fe5770ce3e8751cae9a99e9e717c14b2253381d91777f1" exitCode=0 Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.084947 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rg2tg" event={"ID":"c08511ac-9832-428c-be08-de0771ee5254","Type":"ContainerDied","Data":"78ef571a6fca3e9288fe5770ce3e8751cae9a99e9e717c14b2253381d91777f1"} Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.097044 4749 generic.go:334] "Generic (PLEG): container finished" podID="4fd422ad-ee9e-46ec-ae29-a9cec3a7129b" containerID="50cc9b1123a57004b9fdc5913f93058cd08154562aa4ff9246ecef7ce3aea5ed" exitCode=0 Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.097211 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" event={"ID":"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b","Type":"ContainerDied","Data":"50cc9b1123a57004b9fdc5913f93058cd08154562aa4ff9246ecef7ce3aea5ed"} Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.103461 4749 generic.go:334] "Generic (PLEG): container finished" podID="c56f09b3-981c-4a01-8ea3-4417c239cea6" containerID="7ac856826147df8a42ebc68641d70cdf778712158eab8bdb694ea0f384acae90" exitCode=0 Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.103601 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v6nf" event={"ID":"c56f09b3-981c-4a01-8ea3-4417c239cea6","Type":"ContainerDied","Data":"7ac856826147df8a42ebc68641d70cdf778712158eab8bdb694ea0f384acae90"} Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 
15:54:27.103640 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v6nf" event={"ID":"c56f09b3-981c-4a01-8ea3-4417c239cea6","Type":"ContainerDied","Data":"307f6797704577cad9e06040019163489a95e22e9f2fcb9f6aac772bf839f75c"} Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.103663 4749 scope.go:117] "RemoveContainer" containerID="7ac856826147df8a42ebc68641d70cdf778712158eab8bdb694ea0f384acae90" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.103942 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2v6nf" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.115821 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c56f09b3-981c-4a01-8ea3-4417c239cea6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.123021 4749 generic.go:334] "Generic (PLEG): container finished" podID="a19adf76-af03-4d7f-8661-4d93c67fda2e" containerID="346a95741f6eb6b00b842d6247b7872a73bf9f15bbd8e3175bb62ceba9162149" exitCode=0 Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.123115 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btwnr" event={"ID":"a19adf76-af03-4d7f-8661-4d93c67fda2e","Type":"ContainerDied","Data":"346a95741f6eb6b00b842d6247b7872a73bf9f15bbd8e3175bb62ceba9162149"} Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.130842 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rg2tg" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.137421 4749 scope.go:117] "RemoveContainer" containerID="ac65ee09df92788d62309aae04df24da0c3b7c350427a7e5216ca2c15421d355" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.148020 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9zb9" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.150352 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-btwnr" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.161600 4749 scope.go:117] "RemoveContainer" containerID="93ca7f9d523edfd4b9f72ca52820023e4f4ffc6a5b0d538524dd87189eb643c7" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.184268 4749 scope.go:117] "RemoveContainer" containerID="7ac856826147df8a42ebc68641d70cdf778712158eab8bdb694ea0f384acae90" Mar 10 15:54:27 crc kubenswrapper[4749]: E0310 15:54:27.188387 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ac856826147df8a42ebc68641d70cdf778712158eab8bdb694ea0f384acae90\": container with ID starting with 7ac856826147df8a42ebc68641d70cdf778712158eab8bdb694ea0f384acae90 not found: ID does not exist" containerID="7ac856826147df8a42ebc68641d70cdf778712158eab8bdb694ea0f384acae90" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.188420 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ac856826147df8a42ebc68641d70cdf778712158eab8bdb694ea0f384acae90"} err="failed to get container status \"7ac856826147df8a42ebc68641d70cdf778712158eab8bdb694ea0f384acae90\": rpc error: code = NotFound desc = could not find container \"7ac856826147df8a42ebc68641d70cdf778712158eab8bdb694ea0f384acae90\": container with ID starting with 7ac856826147df8a42ebc68641d70cdf778712158eab8bdb694ea0f384acae90 not found: ID does not exist" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.188443 4749 scope.go:117] "RemoveContainer" containerID="ac65ee09df92788d62309aae04df24da0c3b7c350427a7e5216ca2c15421d355" Mar 10 15:54:27 crc kubenswrapper[4749]: E0310 15:54:27.188894 4749 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"ac65ee09df92788d62309aae04df24da0c3b7c350427a7e5216ca2c15421d355\": container with ID starting with ac65ee09df92788d62309aae04df24da0c3b7c350427a7e5216ca2c15421d355 not found: ID does not exist" containerID="ac65ee09df92788d62309aae04df24da0c3b7c350427a7e5216ca2c15421d355" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.188911 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac65ee09df92788d62309aae04df24da0c3b7c350427a7e5216ca2c15421d355"} err="failed to get container status \"ac65ee09df92788d62309aae04df24da0c3b7c350427a7e5216ca2c15421d355\": rpc error: code = NotFound desc = could not find container \"ac65ee09df92788d62309aae04df24da0c3b7c350427a7e5216ca2c15421d355\": container with ID starting with ac65ee09df92788d62309aae04df24da0c3b7c350427a7e5216ca2c15421d355 not found: ID does not exist" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.188923 4749 scope.go:117] "RemoveContainer" containerID="93ca7f9d523edfd4b9f72ca52820023e4f4ffc6a5b0d538524dd87189eb643c7" Mar 10 15:54:27 crc kubenswrapper[4749]: E0310 15:54:27.190083 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ca7f9d523edfd4b9f72ca52820023e4f4ffc6a5b0d538524dd87189eb643c7\": container with ID starting with 93ca7f9d523edfd4b9f72ca52820023e4f4ffc6a5b0d538524dd87189eb643c7 not found: ID does not exist" containerID="93ca7f9d523edfd4b9f72ca52820023e4f4ffc6a5b0d538524dd87189eb643c7" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.190103 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ca7f9d523edfd4b9f72ca52820023e4f4ffc6a5b0d538524dd87189eb643c7"} err="failed to get container status \"93ca7f9d523edfd4b9f72ca52820023e4f4ffc6a5b0d538524dd87189eb643c7\": rpc error: code = NotFound desc = could not find container 
\"93ca7f9d523edfd4b9f72ca52820023e4f4ffc6a5b0d538524dd87189eb643c7\": container with ID starting with 93ca7f9d523edfd4b9f72ca52820023e4f4ffc6a5b0d538524dd87189eb643c7 not found: ID does not exist" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.197608 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.211109 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2v6nf"] Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.216319 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr4d2\" (UniqueName: \"kubernetes.io/projected/fea7768f-4827-4630-9169-8b44719ad779-kube-api-access-gr4d2\") pod \"fea7768f-4827-4630-9169-8b44719ad779\" (UID: \"fea7768f-4827-4630-9169-8b44719ad779\") " Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.216412 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h88qz\" (UniqueName: \"kubernetes.io/projected/a19adf76-af03-4d7f-8661-4d93c67fda2e-kube-api-access-h88qz\") pod \"a19adf76-af03-4d7f-8661-4d93c67fda2e\" (UID: \"a19adf76-af03-4d7f-8661-4d93c67fda2e\") " Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.216470 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c08511ac-9832-428c-be08-de0771ee5254-utilities\") pod \"c08511ac-9832-428c-be08-de0771ee5254\" (UID: \"c08511ac-9832-428c-be08-de0771ee5254\") " Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.217107 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19adf76-af03-4d7f-8661-4d93c67fda2e-catalog-content\") pod \"a19adf76-af03-4d7f-8661-4d93c67fda2e\" (UID: 
\"a19adf76-af03-4d7f-8661-4d93c67fda2e\") "
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.217141 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c08511ac-9832-428c-be08-de0771ee5254-catalog-content\") pod \"c08511ac-9832-428c-be08-de0771ee5254\" (UID: \"c08511ac-9832-428c-be08-de0771ee5254\") "
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.217195 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea7768f-4827-4630-9169-8b44719ad779-catalog-content\") pod \"fea7768f-4827-4630-9169-8b44719ad779\" (UID: \"fea7768f-4827-4630-9169-8b44719ad779\") "
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.217252 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19adf76-af03-4d7f-8661-4d93c67fda2e-utilities\") pod \"a19adf76-af03-4d7f-8661-4d93c67fda2e\" (UID: \"a19adf76-af03-4d7f-8661-4d93c67fda2e\") "
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.217305 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea7768f-4827-4630-9169-8b44719ad779-utilities\") pod \"fea7768f-4827-4630-9169-8b44719ad779\" (UID: \"fea7768f-4827-4630-9169-8b44719ad779\") "
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.217334 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2qrh\" (UniqueName: \"kubernetes.io/projected/c08511ac-9832-428c-be08-de0771ee5254-kube-api-access-p2qrh\") pod \"c08511ac-9832-428c-be08-de0771ee5254\" (UID: \"c08511ac-9832-428c-be08-de0771ee5254\") "
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.220770 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fea7768f-4827-4630-9169-8b44719ad779-kube-api-access-gr4d2" (OuterVolumeSpecName: "kube-api-access-gr4d2") pod "fea7768f-4827-4630-9169-8b44719ad779" (UID: "fea7768f-4827-4630-9169-8b44719ad779"). InnerVolumeSpecName "kube-api-access-gr4d2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.221512 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fea7768f-4827-4630-9169-8b44719ad779-utilities" (OuterVolumeSpecName: "utilities") pod "fea7768f-4827-4630-9169-8b44719ad779" (UID: "fea7768f-4827-4630-9169-8b44719ad779"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.221634 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a19adf76-af03-4d7f-8661-4d93c67fda2e-utilities" (OuterVolumeSpecName: "utilities") pod "a19adf76-af03-4d7f-8661-4d93c67fda2e" (UID: "a19adf76-af03-4d7f-8661-4d93c67fda2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.221893 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a19adf76-af03-4d7f-8661-4d93c67fda2e-kube-api-access-h88qz" (OuterVolumeSpecName: "kube-api-access-h88qz") pod "a19adf76-af03-4d7f-8661-4d93c67fda2e" (UID: "a19adf76-af03-4d7f-8661-4d93c67fda2e"). InnerVolumeSpecName "kube-api-access-h88qz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.222581 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c08511ac-9832-428c-be08-de0771ee5254-utilities" (OuterVolumeSpecName: "utilities") pod "c08511ac-9832-428c-be08-de0771ee5254" (UID: "c08511ac-9832-428c-be08-de0771ee5254"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.224961 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2v6nf"]
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.227293 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c08511ac-9832-428c-be08-de0771ee5254-kube-api-access-p2qrh" (OuterVolumeSpecName: "kube-api-access-p2qrh") pod "c08511ac-9832-428c-be08-de0771ee5254" (UID: "c08511ac-9832-428c-be08-de0771ee5254"). InnerVolumeSpecName "kube-api-access-p2qrh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.241318 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2mnc7"]
Mar 10 15:54:27 crc kubenswrapper[4749]: W0310 15:54:27.247487 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6904f9b8_adbe_426e_9021_0da77d658ad6.slice/crio-0a88522a3e59e92dfea2fb64a8ab4221c2b22e2fa1067ea6517784bf6bdd9de3 WatchSource:0}: Error finding container 0a88522a3e59e92dfea2fb64a8ab4221c2b22e2fa1067ea6517784bf6bdd9de3: Status 404 returned error can't find the container with id 0a88522a3e59e92dfea2fb64a8ab4221c2b22e2fa1067ea6517784bf6bdd9de3
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.260969 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c08511ac-9832-428c-be08-de0771ee5254-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c08511ac-9832-428c-be08-de0771ee5254" (UID: "c08511ac-9832-428c-be08-de0771ee5254"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.266252 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9qnqn"
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.319435 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-marketplace-trusted-ca\") pod \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\" (UID: \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\") "
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.319498 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kckv\" (UniqueName: \"kubernetes.io/projected/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-kube-api-access-4kckv\") pod \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\" (UID: \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\") "
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.319615 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-marketplace-operator-metrics\") pod \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\" (UID: \"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b\") "
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.319638 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp9kj\" (UniqueName: \"kubernetes.io/projected/3484c369-2c9c-48d5-b7be-9dadf06d09ca-kube-api-access-vp9kj\") pod \"3484c369-2c9c-48d5-b7be-9dadf06d09ca\" (UID: \"3484c369-2c9c-48d5-b7be-9dadf06d09ca\") "
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.319688 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3484c369-2c9c-48d5-b7be-9dadf06d09ca-utilities\") pod \"3484c369-2c9c-48d5-b7be-9dadf06d09ca\" (UID: \"3484c369-2c9c-48d5-b7be-9dadf06d09ca\") "
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.319725 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3484c369-2c9c-48d5-b7be-9dadf06d09ca-catalog-content\") pod \"3484c369-2c9c-48d5-b7be-9dadf06d09ca\" (UID: \"3484c369-2c9c-48d5-b7be-9dadf06d09ca\") "
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.319961 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea7768f-4827-4630-9169-8b44719ad779-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.319974 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2qrh\" (UniqueName: \"kubernetes.io/projected/c08511ac-9832-428c-be08-de0771ee5254-kube-api-access-p2qrh\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.319984 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr4d2\" (UniqueName: \"kubernetes.io/projected/fea7768f-4827-4630-9169-8b44719ad779-kube-api-access-gr4d2\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.319993 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h88qz\" (UniqueName: \"kubernetes.io/projected/a19adf76-af03-4d7f-8661-4d93c67fda2e-kube-api-access-h88qz\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.320002 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c08511ac-9832-428c-be08-de0771ee5254-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.320012 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c08511ac-9832-428c-be08-de0771ee5254-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.320025 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19adf76-af03-4d7f-8661-4d93c67fda2e-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.321343 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "4fd422ad-ee9e-46ec-ae29-a9cec3a7129b" (UID: "4fd422ad-ee9e-46ec-ae29-a9cec3a7129b"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.327720 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3484c369-2c9c-48d5-b7be-9dadf06d09ca-utilities" (OuterVolumeSpecName: "utilities") pod "3484c369-2c9c-48d5-b7be-9dadf06d09ca" (UID: "3484c369-2c9c-48d5-b7be-9dadf06d09ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.327998 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3484c369-2c9c-48d5-b7be-9dadf06d09ca-kube-api-access-vp9kj" (OuterVolumeSpecName: "kube-api-access-vp9kj") pod "3484c369-2c9c-48d5-b7be-9dadf06d09ca" (UID: "3484c369-2c9c-48d5-b7be-9dadf06d09ca"). InnerVolumeSpecName "kube-api-access-vp9kj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.340892 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-kube-api-access-4kckv" (OuterVolumeSpecName: "kube-api-access-4kckv") pod "4fd422ad-ee9e-46ec-ae29-a9cec3a7129b" (UID: "4fd422ad-ee9e-46ec-ae29-a9cec3a7129b"). InnerVolumeSpecName "kube-api-access-4kckv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.340928 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fea7768f-4827-4630-9169-8b44719ad779-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fea7768f-4827-4630-9169-8b44719ad779" (UID: "fea7768f-4827-4630-9169-8b44719ad779"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.341499 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "4fd422ad-ee9e-46ec-ae29-a9cec3a7129b" (UID: "4fd422ad-ee9e-46ec-ae29-a9cec3a7129b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.345842 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a19adf76-af03-4d7f-8661-4d93c67fda2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a19adf76-af03-4d7f-8661-4d93c67fda2e" (UID: "a19adf76-af03-4d7f-8661-4d93c67fda2e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.424277 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.424321 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp9kj\" (UniqueName: \"kubernetes.io/projected/3484c369-2c9c-48d5-b7be-9dadf06d09ca-kube-api-access-vp9kj\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.424335 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19adf76-af03-4d7f-8661-4d93c67fda2e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.424350 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3484c369-2c9c-48d5-b7be-9dadf06d09ca-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.424365 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea7768f-4827-4630-9169-8b44719ad779-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.424403 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.424416 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kckv\" (UniqueName: \"kubernetes.io/projected/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b-kube-api-access-4kckv\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.490102 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3484c369-2c9c-48d5-b7be-9dadf06d09ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3484c369-2c9c-48d5-b7be-9dadf06d09ca" (UID: "3484c369-2c9c-48d5-b7be-9dadf06d09ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.525921 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3484c369-2c9c-48d5-b7be-9dadf06d09ca-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:27 crc kubenswrapper[4749]: I0310 15:54:27.614896 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c56f09b3-981c-4a01-8ea3-4417c239cea6" path="/var/lib/kubelet/pods/c56f09b3-981c-4a01-8ea3-4417c239cea6/volumes"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.133257 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7" event={"ID":"6904f9b8-adbe-426e-9021-0da77d658ad6","Type":"ContainerStarted","Data":"ecbb3d34c59791fd6c16a9e66bd05c1589bd8cea7a2480a1033d43a6d6aa49f8"}
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.133698 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.133720 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7" event={"ID":"6904f9b8-adbe-426e-9021-0da77d658ad6","Type":"ContainerStarted","Data":"0a88522a3e59e92dfea2fb64a8ab4221c2b22e2fa1067ea6517784bf6bdd9de3"}
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.136369 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.136705 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qnqn" event={"ID":"3484c369-2c9c-48d5-b7be-9dadf06d09ca","Type":"ContainerDied","Data":"c6aa59165035d6d65502e04a99b34ef8a876c78b9402ecb64ab008921a80432a"}
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.136757 4749 scope.go:117] "RemoveContainer" containerID="ff560fa5eda1c0f64bbe79d1ccb3f7ac2c8481d3c5837920f56bce2af60a39d5"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.136930 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9qnqn"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.144566 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9zb9" event={"ID":"fea7768f-4827-4630-9169-8b44719ad779","Type":"ContainerDied","Data":"30f71e3938620f6cc8389c6c836c11999dfc93b9480b095702fa466cec67664a"}
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.144779 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9zb9"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.152296 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rg2tg" event={"ID":"c08511ac-9832-428c-be08-de0771ee5254","Type":"ContainerDied","Data":"63fb4fcd242ade22023453c0858d10aa357885be86291acb550ceeb7b15342ff"}
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.152480 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rg2tg"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.164328 4749 scope.go:117] "RemoveContainer" containerID="1df99cd0bf285de11b8c76bf66c56faf36f31fde5f215a7d78cb3969d66ca427"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.164585 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7" event={"ID":"4fd422ad-ee9e-46ec-ae29-a9cec3a7129b","Type":"ContainerDied","Data":"cc2d468636c47b556f7dd10089a884822439b8b70929106804ddc0df12a8e8bc"}
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.164634 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2pgr7"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.172569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btwnr" event={"ID":"a19adf76-af03-4d7f-8661-4d93c67fda2e","Type":"ContainerDied","Data":"5f1fc77525173b361f9f0ee26bce57c6ae5d6483a8cc9b5a7d016c88abffdcc2"}
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.172711 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-btwnr"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.185197 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2mnc7" podStartSLOduration=2.185168275 podStartE2EDuration="2.185168275s" podCreationTimestamp="2026-03-10 15:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:54:28.162046019 +0000 UTC m=+365.283911716" watchObservedRunningTime="2026-03-10 15:54:28.185168275 +0000 UTC m=+365.307033992"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.186238 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9zb9"]
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.196603 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h9zb9"]
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.196714 4749 scope.go:117] "RemoveContainer" containerID="6ea8c2fd2f0f20d37c052a4d7005dfa911fc49f83e8c97bd4f665df6b2376216"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.223504 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9qnqn"]
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.235514 4749 scope.go:117] "RemoveContainer" containerID="b994cb03a82d4cfa66f289ce148006f3adcba94dc9e16fe082b58d16293c0d1c"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.237081 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9qnqn"]
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.249639 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-btwnr"]
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.260502 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-btwnr"]
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.261539 4749 scope.go:117] "RemoveContainer" containerID="a1f1dbb53913431fa45ad2b611f6ddc746262d2610464e9882e2d1d8f7ee9de2"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.269414 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rg2tg"]
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.273900 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rg2tg"]
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.278572 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pgr7"]
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.278615 4749 scope.go:117] "RemoveContainer" containerID="2c92c4720be40da5d37886140c6df5544135f1414197d7ffc778057bb0be3db9"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.282198 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pgr7"]
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.297794 4749 scope.go:117] "RemoveContainer" containerID="78ef571a6fca3e9288fe5770ce3e8751cae9a99e9e717c14b2253381d91777f1"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.319619 4749 scope.go:117] "RemoveContainer" containerID="dc04157356b58916cc79bd9f71b43abe27a69fc6eabca519d0226abce170be29"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.336983 4749 scope.go:117] "RemoveContainer" containerID="61411f915edeb23d34444958efb7feb36f65875f9b942f19a5f7da6c84a6799a"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.353762 4749 scope.go:117] "RemoveContainer" containerID="50cc9b1123a57004b9fdc5913f93058cd08154562aa4ff9246ecef7ce3aea5ed"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.368258 4749 scope.go:117] "RemoveContainer" containerID="346a95741f6eb6b00b842d6247b7872a73bf9f15bbd8e3175bb62ceba9162149"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.385308 4749 scope.go:117] "RemoveContainer" containerID="e61340a77749010fde4d7b1b845714b9993675ba75f8bb4749d9c77d5ca1b7ac"
Mar 10 15:54:28 crc kubenswrapper[4749]: I0310 15:54:28.408422 4749 scope.go:117] "RemoveContainer" containerID="c175286ddd09c72b26708698155d7481685999df6138d662ea4bc490018c3434"
Mar 10 15:54:29 crc kubenswrapper[4749]: I0310 15:54:29.613888 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" path="/var/lib/kubelet/pods/3484c369-2c9c-48d5-b7be-9dadf06d09ca/volumes"
Mar 10 15:54:29 crc kubenswrapper[4749]: I0310 15:54:29.614564 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fd422ad-ee9e-46ec-ae29-a9cec3a7129b" path="/var/lib/kubelet/pods/4fd422ad-ee9e-46ec-ae29-a9cec3a7129b/volumes"
Mar 10 15:54:29 crc kubenswrapper[4749]: I0310 15:54:29.615007 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a19adf76-af03-4d7f-8661-4d93c67fda2e" path="/var/lib/kubelet/pods/a19adf76-af03-4d7f-8661-4d93c67fda2e/volumes"
Mar 10 15:54:29 crc kubenswrapper[4749]: I0310 15:54:29.615636 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c08511ac-9832-428c-be08-de0771ee5254" path="/var/lib/kubelet/pods/c08511ac-9832-428c-be08-de0771ee5254/volumes"
Mar 10 15:54:29 crc kubenswrapper[4749]: I0310 15:54:29.616208 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fea7768f-4827-4630-9169-8b44719ad779" path="/var/lib/kubelet/pods/fea7768f-4827-4630-9169-8b44719ad779/volumes"
Mar 10 15:54:31 crc kubenswrapper[4749]: I0310 15:54:31.757527 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c8b7b957-9rrnn"]
Mar 10 15:54:31 crc kubenswrapper[4749]: I0310 15:54:31.758198 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" podUID="1d530d0b-3ad3-416e-a600-997f7ed4f976" containerName="controller-manager" containerID="cri-o://51a6d055056ecd15eae2417e4e9d76b0d405d79b90e1b2169bc48df09966d7dc" gracePeriod=30
Mar 10 15:54:31 crc kubenswrapper[4749]: I0310 15:54:31.773806 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq"]
Mar 10 15:54:31 crc kubenswrapper[4749]: I0310 15:54:31.774125 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" podUID="2858d6db-ce93-467d-8eba-54579431e3ae" containerName="route-controller-manager" containerID="cri-o://ba48b9ad053132f2bf6ac15e8eece271734c4f6298c9c277914a6b452cfc2957" gracePeriod=30
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.207359 4749 generic.go:334] "Generic (PLEG): container finished" podID="2858d6db-ce93-467d-8eba-54579431e3ae" containerID="ba48b9ad053132f2bf6ac15e8eece271734c4f6298c9c277914a6b452cfc2957" exitCode=0
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.207427 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" event={"ID":"2858d6db-ce93-467d-8eba-54579431e3ae","Type":"ContainerDied","Data":"ba48b9ad053132f2bf6ac15e8eece271734c4f6298c9c277914a6b452cfc2957"}
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.209075 4749 generic.go:334] "Generic (PLEG): container finished" podID="1d530d0b-3ad3-416e-a600-997f7ed4f976" containerID="51a6d055056ecd15eae2417e4e9d76b0d405d79b90e1b2169bc48df09966d7dc" exitCode=0
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.209105 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" event={"ID":"1d530d0b-3ad3-416e-a600-997f7ed4f976","Type":"ContainerDied","Data":"51a6d055056ecd15eae2417e4e9d76b0d405d79b90e1b2169bc48df09966d7dc"}
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.383595 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq"
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.466168 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn"
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.504253 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2858d6db-ce93-467d-8eba-54579431e3ae-serving-cert\") pod \"2858d6db-ce93-467d-8eba-54579431e3ae\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") "
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.504313 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-config\") pod \"1d530d0b-3ad3-416e-a600-997f7ed4f976\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") "
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.504364 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-proxy-ca-bundles\") pod \"1d530d0b-3ad3-416e-a600-997f7ed4f976\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") "
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.504436 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-client-ca\") pod \"1d530d0b-3ad3-416e-a600-997f7ed4f976\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") "
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.505366 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-config" (OuterVolumeSpecName: "config") pod "1d530d0b-3ad3-416e-a600-997f7ed4f976" (UID: "1d530d0b-3ad3-416e-a600-997f7ed4f976"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.505393 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1d530d0b-3ad3-416e-a600-997f7ed4f976" (UID: "1d530d0b-3ad3-416e-a600-997f7ed4f976"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.505451 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b8f5\" (UniqueName: \"kubernetes.io/projected/2858d6db-ce93-467d-8eba-54579431e3ae-kube-api-access-4b8f5\") pod \"2858d6db-ce93-467d-8eba-54579431e3ae\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") "
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.505490 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2858d6db-ce93-467d-8eba-54579431e3ae-client-ca\") pod \"2858d6db-ce93-467d-8eba-54579431e3ae\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") "
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.505547 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l49qm\" (UniqueName: \"kubernetes.io/projected/1d530d0b-3ad3-416e-a600-997f7ed4f976-kube-api-access-l49qm\") pod \"1d530d0b-3ad3-416e-a600-997f7ed4f976\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") "
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.505577 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d530d0b-3ad3-416e-a600-997f7ed4f976-serving-cert\") pod \"1d530d0b-3ad3-416e-a600-997f7ed4f976\" (UID: \"1d530d0b-3ad3-416e-a600-997f7ed4f976\") "
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.505678 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2858d6db-ce93-467d-8eba-54579431e3ae-config\") pod \"2858d6db-ce93-467d-8eba-54579431e3ae\" (UID: \"2858d6db-ce93-467d-8eba-54579431e3ae\") "
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.505992 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.506015 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-config\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.505527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-client-ca" (OuterVolumeSpecName: "client-ca") pod "1d530d0b-3ad3-416e-a600-997f7ed4f976" (UID: "1d530d0b-3ad3-416e-a600-997f7ed4f976"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.506758 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2858d6db-ce93-467d-8eba-54579431e3ae-client-ca" (OuterVolumeSpecName: "client-ca") pod "2858d6db-ce93-467d-8eba-54579431e3ae" (UID: "2858d6db-ce93-467d-8eba-54579431e3ae"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.506747 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2858d6db-ce93-467d-8eba-54579431e3ae-config" (OuterVolumeSpecName: "config") pod "2858d6db-ce93-467d-8eba-54579431e3ae" (UID: "2858d6db-ce93-467d-8eba-54579431e3ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.512133 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d530d0b-3ad3-416e-a600-997f7ed4f976-kube-api-access-l49qm" (OuterVolumeSpecName: "kube-api-access-l49qm") pod "1d530d0b-3ad3-416e-a600-997f7ed4f976" (UID: "1d530d0b-3ad3-416e-a600-997f7ed4f976"). InnerVolumeSpecName "kube-api-access-l49qm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.512168 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2858d6db-ce93-467d-8eba-54579431e3ae-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2858d6db-ce93-467d-8eba-54579431e3ae" (UID: "2858d6db-ce93-467d-8eba-54579431e3ae"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.512452 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d530d0b-3ad3-416e-a600-997f7ed4f976-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1d530d0b-3ad3-416e-a600-997f7ed4f976" (UID: "1d530d0b-3ad3-416e-a600-997f7ed4f976"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.513132 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2858d6db-ce93-467d-8eba-54579431e3ae-kube-api-access-4b8f5" (OuterVolumeSpecName: "kube-api-access-4b8f5") pod "2858d6db-ce93-467d-8eba-54579431e3ae" (UID: "2858d6db-ce93-467d-8eba-54579431e3ae"). InnerVolumeSpecName "kube-api-access-4b8f5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.607531 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2858d6db-ce93-467d-8eba-54579431e3ae-config\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.607596 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2858d6db-ce93-467d-8eba-54579431e3ae-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.607612 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d530d0b-3ad3-416e-a600-997f7ed4f976-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.607627 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b8f5\" (UniqueName: \"kubernetes.io/projected/2858d6db-ce93-467d-8eba-54579431e3ae-kube-api-access-4b8f5\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.607641 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2858d6db-ce93-467d-8eba-54579431e3ae-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.607658 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l49qm\" (UniqueName: \"kubernetes.io/projected/1d530d0b-3ad3-416e-a600-997f7ed4f976-kube-api-access-l49qm\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:32 crc kubenswrapper[4749]: I0310 15:54:32.607671 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d530d0b-3ad3-416e-a600-997f7ed4f976-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.167018 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.228460 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-598844dbc9-6f56f"]
Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.228844 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56f09b3-981c-4a01-8ea3-4417c239cea6" containerName="registry-server"
Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.228874 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56f09b3-981c-4a01-8ea3-4417c239cea6" containerName="registry-server"
Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.228895 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd422ad-ee9e-46ec-ae29-a9cec3a7129b" containerName="marketplace-operator"
Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.228909 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd422ad-ee9e-46ec-ae29-a9cec3a7129b" containerName="marketplace-operator"
Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.228928 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56f09b3-981c-4a01-8ea3-4417c239cea6" containerName="extract-content"
Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.228939 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56f09b3-981c-4a01-8ea3-4417c239cea6" containerName="extract-content"
Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.228955 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19adf76-af03-4d7f-8661-4d93c67fda2e" containerName="extract-content"
Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229002 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19adf76-af03-4d7f-8661-4d93c67fda2e" containerName="extract-content"
Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.229015 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56f09b3-981c-4a01-8ea3-4417c239cea6" containerName="extract-utilities"
Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229025 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56f09b3-981c-4a01-8ea3-4417c239cea6" containerName="extract-utilities"
Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.229040 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08511ac-9832-428c-be08-de0771ee5254" containerName="extract-content"
Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229049 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08511ac-9832-428c-be08-de0771ee5254" containerName="extract-content"
Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.229063 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" containerName="extract-content"
Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229073 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" containerName="extract-content"
Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.229084 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea7768f-4827-4630-9169-8b44719ad779" containerName="extract-content"
Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229092 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea7768f-4827-4630-9169-8b44719ad779" containerName="extract-content"
Mar 10 15:54:33 crc kubenswrapper[4749]: E0310
15:54:33.229102 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08511ac-9832-428c-be08-de0771ee5254" containerName="registry-server" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229110 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08511ac-9832-428c-be08-de0771ee5254" containerName="registry-server" Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.229123 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea7768f-4827-4630-9169-8b44719ad779" containerName="registry-server" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229133 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea7768f-4827-4630-9169-8b44719ad779" containerName="registry-server" Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.229143 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2858d6db-ce93-467d-8eba-54579431e3ae" containerName="route-controller-manager" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229153 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2858d6db-ce93-467d-8eba-54579431e3ae" containerName="route-controller-manager" Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.229170 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d530d0b-3ad3-416e-a600-997f7ed4f976" containerName="controller-manager" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229178 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d530d0b-3ad3-416e-a600-997f7ed4f976" containerName="controller-manager" Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.229189 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" containerName="registry-server" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229199 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" containerName="registry-server" Mar 10 15:54:33 crc 
kubenswrapper[4749]: E0310 15:54:33.229212 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" containerName="extract-utilities" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229221 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" containerName="extract-utilities" Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.229233 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19adf76-af03-4d7f-8661-4d93c67fda2e" containerName="extract-utilities" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229242 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19adf76-af03-4d7f-8661-4d93c67fda2e" containerName="extract-utilities" Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.229253 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08511ac-9832-428c-be08-de0771ee5254" containerName="extract-utilities" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229261 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08511ac-9832-428c-be08-de0771ee5254" containerName="extract-utilities" Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.229272 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19adf76-af03-4d7f-8661-4d93c67fda2e" containerName="registry-server" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229280 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19adf76-af03-4d7f-8661-4d93c67fda2e" containerName="registry-server" Mar 10 15:54:33 crc kubenswrapper[4749]: E0310 15:54:33.229290 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea7768f-4827-4630-9169-8b44719ad779" containerName="extract-utilities" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229298 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea7768f-4827-4630-9169-8b44719ad779" containerName="extract-utilities" Mar 10 15:54:33 
crc kubenswrapper[4749]: I0310 15:54:33.229428 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229461 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d530d0b-3ad3-416e-a600-997f7ed4f976" containerName="controller-manager" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229482 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd422ad-ee9e-46ec-ae29-a9cec3a7129b" containerName="marketplace-operator" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229495 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56f09b3-981c-4a01-8ea3-4417c239cea6" containerName="registry-server" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229517 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a19adf76-af03-4d7f-8661-4d93c67fda2e" containerName="registry-server" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229531 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2858d6db-ce93-467d-8eba-54579431e3ae" containerName="route-controller-manager" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229550 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea7768f-4827-4630-9169-8b44719ad779" containerName="registry-server" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229573 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c08511ac-9832-428c-be08-de0771ee5254" containerName="registry-server" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.229606 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3484c369-2c9c-48d5-b7be-9dadf06d09ca" containerName="registry-server" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.230113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-67c8b7b957-9rrnn" event={"ID":"1d530d0b-3ad3-416e-a600-997f7ed4f976","Type":"ContainerDied","Data":"826a155f3585d3dd6e810ac5e1cb44a19ccf64c3f535b735768f3540aa2b91a0"} Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.230181 4749 scope.go:117] "RemoveContainer" containerID="51a6d055056ecd15eae2417e4e9d76b0d405d79b90e1b2169bc48df09966d7dc" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.230455 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.232240 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" event={"ID":"2858d6db-ce93-467d-8eba-54579431e3ae","Type":"ContainerDied","Data":"d6ae9eb3a99b51d833e1c56b683e373910659ef75b386df01cf99d80cfacbf0a"} Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.232329 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.236230 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.238463 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p"] Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.239859 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.240280 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.240663 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.241120 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.241329 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.241480 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.241901 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.247867 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.248035 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.248067 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.248164 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 15:54:33 crc 
kubenswrapper[4749]: I0310 15:54:33.248625 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.250369 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-598844dbc9-6f56f"] Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.254197 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p"] Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.254489 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.259335 4749 scope.go:117] "RemoveContainer" containerID="ba48b9ad053132f2bf6ac15e8eece271734c4f6298c9c277914a6b452cfc2957" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.318332 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7bps\" (UniqueName: \"kubernetes.io/projected/500569ac-f2ab-4182-80a3-73d821946d15-kube-api-access-d7bps\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.318449 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe9eb98-c991-43c4-9249-fb429a0ed84d-serving-cert\") pod \"route-controller-manager-65b45d8f5-8gq9p\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") " pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.318498 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-client-ca\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.318534 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4fe9eb98-c991-43c4-9249-fb429a0ed84d-client-ca\") pod \"route-controller-manager-65b45d8f5-8gq9p\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") " pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.318567 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fe9eb98-c991-43c4-9249-fb429a0ed84d-config\") pod \"route-controller-manager-65b45d8f5-8gq9p\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") " pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.318591 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-config\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.318624 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/500569ac-f2ab-4182-80a3-73d821946d15-serving-cert\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " 
pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.318655 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-proxy-ca-bundles\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.318689 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9b4b\" (UniqueName: \"kubernetes.io/projected/4fe9eb98-c991-43c4-9249-fb429a0ed84d-kube-api-access-j9b4b\") pod \"route-controller-manager-65b45d8f5-8gq9p\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") " pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.322383 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq"] Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.326447 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7678dbd556-pgnbq"] Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.332755 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c8b7b957-9rrnn"] Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.337580 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-67c8b7b957-9rrnn"] Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.420562 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4fe9eb98-c991-43c4-9249-fb429a0ed84d-client-ca\") pod \"route-controller-manager-65b45d8f5-8gq9p\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") " pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.420661 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fe9eb98-c991-43c4-9249-fb429a0ed84d-config\") pod \"route-controller-manager-65b45d8f5-8gq9p\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") " pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.420691 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-config\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.420731 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/500569ac-f2ab-4182-80a3-73d821946d15-serving-cert\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.420757 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-proxy-ca-bundles\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.420785 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9b4b\" (UniqueName: \"kubernetes.io/projected/4fe9eb98-c991-43c4-9249-fb429a0ed84d-kube-api-access-j9b4b\") pod \"route-controller-manager-65b45d8f5-8gq9p\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") " pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.420823 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7bps\" (UniqueName: \"kubernetes.io/projected/500569ac-f2ab-4182-80a3-73d821946d15-kube-api-access-d7bps\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.420858 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe9eb98-c991-43c4-9249-fb429a0ed84d-serving-cert\") pod \"route-controller-manager-65b45d8f5-8gq9p\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") " pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.420887 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-client-ca\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.422232 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-client-ca\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " 
pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.423257 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-config\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.423277 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-proxy-ca-bundles\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.423921 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4fe9eb98-c991-43c4-9249-fb429a0ed84d-client-ca\") pod \"route-controller-manager-65b45d8f5-8gq9p\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") " pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.425662 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fe9eb98-c991-43c4-9249-fb429a0ed84d-config\") pod \"route-controller-manager-65b45d8f5-8gq9p\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") " pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.427822 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe9eb98-c991-43c4-9249-fb429a0ed84d-serving-cert\") pod 
\"route-controller-manager-65b45d8f5-8gq9p\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") " pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.428742 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/500569ac-f2ab-4182-80a3-73d821946d15-serving-cert\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.447877 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9b4b\" (UniqueName: \"kubernetes.io/projected/4fe9eb98-c991-43c4-9249-fb429a0ed84d-kube-api-access-j9b4b\") pod \"route-controller-manager-65b45d8f5-8gq9p\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") " pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.450068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7bps\" (UniqueName: \"kubernetes.io/projected/500569ac-f2ab-4182-80a3-73d821946d15-kube-api-access-d7bps\") pod \"controller-manager-598844dbc9-6f56f\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.562998 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.599299 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.616822 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d530d0b-3ad3-416e-a600-997f7ed4f976" path="/var/lib/kubelet/pods/1d530d0b-3ad3-416e-a600-997f7ed4f976/volumes" Mar 10 15:54:33 crc kubenswrapper[4749]: I0310 15:54:33.617826 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2858d6db-ce93-467d-8eba-54579431e3ae" path="/var/lib/kubelet/pods/2858d6db-ce93-467d-8eba-54579431e3ae/volumes" Mar 10 15:54:34 crc kubenswrapper[4749]: I0310 15:54:34.023050 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 15:54:34 crc kubenswrapper[4749]: I0310 15:54:34.038120 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-598844dbc9-6f56f"] Mar 10 15:54:34 crc kubenswrapper[4749]: I0310 15:54:34.101061 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p"] Mar 10 15:54:34 crc kubenswrapper[4749]: I0310 15:54:34.243247 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" event={"ID":"4fe9eb98-c991-43c4-9249-fb429a0ed84d","Type":"ContainerStarted","Data":"868c58713f09391ecdbc6984a9492ed829f5aec214b406f0401fa3a1d6e5a1cc"} Mar 10 15:54:34 crc kubenswrapper[4749]: I0310 15:54:34.250913 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" event={"ID":"500569ac-f2ab-4182-80a3-73d821946d15","Type":"ContainerStarted","Data":"5429fad415ae51cc04d6f058a0bae3ba3d05228efc6703c6176862340482cbf0"} Mar 10 15:54:34 crc kubenswrapper[4749]: I0310 15:54:34.250967 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" event={"ID":"500569ac-f2ab-4182-80a3-73d821946d15","Type":"ContainerStarted","Data":"4522a1a40abceca350f44be8b4f706a147814cff34b6abb5e854bebbf9a1a4e3"} Mar 10 15:54:34 crc kubenswrapper[4749]: I0310 15:54:34.251209 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:54:34 crc kubenswrapper[4749]: I0310 15:54:34.254647 4749 patch_prober.go:28] interesting pod/controller-manager-598844dbc9-6f56f container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" start-of-body= Mar 10 15:54:34 crc kubenswrapper[4749]: I0310 15:54:34.254718 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" podUID="500569ac-f2ab-4182-80a3-73d821946d15" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" Mar 10 15:54:34 crc kubenswrapper[4749]: I0310 15:54:34.271026 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" podStartSLOduration=3.271001359 podStartE2EDuration="3.271001359s" podCreationTimestamp="2026-03-10 15:54:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:54:34.26960914 +0000 UTC m=+371.391474837" watchObservedRunningTime="2026-03-10 15:54:34.271001359 +0000 UTC m=+371.392867046" Mar 10 15:54:35 crc kubenswrapper[4749]: I0310 15:54:35.262943 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" 
event={"ID":"4fe9eb98-c991-43c4-9249-fb429a0ed84d","Type":"ContainerStarted","Data":"ca9dc21a51e8fbaac724524d721ffde6e29cdf61299844c6284d9e5da77367d3"}
Mar 10 15:54:35 crc kubenswrapper[4749]: I0310 15:54:35.269243 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f"
Mar 10 15:54:35 crc kubenswrapper[4749]: I0310 15:54:35.313312 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" podStartSLOduration=4.313276217 podStartE2EDuration="4.313276217s" podCreationTimestamp="2026-03-10 15:54:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:54:35.289812301 +0000 UTC m=+372.411677988" watchObservedRunningTime="2026-03-10 15:54:35.313276217 +0000 UTC m=+372.435141904"
Mar 10 15:54:36 crc kubenswrapper[4749]: I0310 15:54:36.268897 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p"
Mar 10 15:54:36 crc kubenswrapper[4749]: I0310 15:54:36.274851 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p"
Mar 10 15:54:38 crc kubenswrapper[4749]: I0310 15:54:38.359812 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 10 15:54:39 crc kubenswrapper[4749]: I0310 15:54:39.133367 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 10 15:54:39 crc kubenswrapper[4749]: I0310 15:54:39.318621 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 10 15:54:40 crc kubenswrapper[4749]: I0310 15:54:40.755565 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 10 15:54:42 crc kubenswrapper[4749]: I0310 15:54:42.410176 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 10 15:54:43 crc kubenswrapper[4749]: I0310 15:54:43.310414 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 10 15:54:43 crc kubenswrapper[4749]: I0310 15:54:43.709386 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 10 15:54:43 crc kubenswrapper[4749]: I0310 15:54:43.755844 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 10 15:54:44 crc kubenswrapper[4749]: I0310 15:54:44.670026 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 10 15:54:46 crc kubenswrapper[4749]: I0310 15:54:46.973848 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 10 15:54:48 crc kubenswrapper[4749]: I0310 15:54:48.977135 4749 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 10 15:54:49 crc kubenswrapper[4749]: I0310 15:54:49.023640 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 10 15:54:50 crc kubenswrapper[4749]: I0310 15:54:50.686064 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 10 15:54:51 crc kubenswrapper[4749]: I0310 15:54:51.402739 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 10 15:54:53 crc kubenswrapper[4749]: I0310 15:54:53.370292 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 10 15:54:53 crc kubenswrapper[4749]: I0310 15:54:53.664646 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 10 15:54:55 crc kubenswrapper[4749]: I0310 15:54:55.512928 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 10 15:54:55 crc kubenswrapper[4749]: I0310 15:54:55.872215 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 10 15:54:55 crc kubenswrapper[4749]: I0310 15:54:55.968583 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 10 15:54:59 crc kubenswrapper[4749]: I0310 15:54:59.533089 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 10 15:55:02 crc kubenswrapper[4749]: I0310 15:55:02.661524 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 10 15:55:11 crc kubenswrapper[4749]: I0310 15:55:11.762725 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p"]
Mar 10 15:55:11 crc kubenswrapper[4749]: I0310 15:55:11.763680 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" podUID="4fe9eb98-c991-43c4-9249-fb429a0ed84d" containerName="route-controller-manager" containerID="cri-o://ca9dc21a51e8fbaac724524d721ffde6e29cdf61299844c6284d9e5da77367d3" gracePeriod=30
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.204878 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p"
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.334291 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fe9eb98-c991-43c4-9249-fb429a0ed84d-config\") pod \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") "
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.334406 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4fe9eb98-c991-43c4-9249-fb429a0ed84d-client-ca\") pod \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") "
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.334476 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9b4b\" (UniqueName: \"kubernetes.io/projected/4fe9eb98-c991-43c4-9249-fb429a0ed84d-kube-api-access-j9b4b\") pod \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") "
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.334565 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe9eb98-c991-43c4-9249-fb429a0ed84d-serving-cert\") pod \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\" (UID: \"4fe9eb98-c991-43c4-9249-fb429a0ed84d\") "
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.336204 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe9eb98-c991-43c4-9249-fb429a0ed84d-client-ca" (OuterVolumeSpecName: "client-ca") pod "4fe9eb98-c991-43c4-9249-fb429a0ed84d" (UID: "4fe9eb98-c991-43c4-9249-fb429a0ed84d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.336226 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe9eb98-c991-43c4-9249-fb429a0ed84d-config" (OuterVolumeSpecName: "config") pod "4fe9eb98-c991-43c4-9249-fb429a0ed84d" (UID: "4fe9eb98-c991-43c4-9249-fb429a0ed84d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.343318 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe9eb98-c991-43c4-9249-fb429a0ed84d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4fe9eb98-c991-43c4-9249-fb429a0ed84d" (UID: "4fe9eb98-c991-43c4-9249-fb429a0ed84d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.353169 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe9eb98-c991-43c4-9249-fb429a0ed84d-kube-api-access-j9b4b" (OuterVolumeSpecName: "kube-api-access-j9b4b") pod "4fe9eb98-c991-43c4-9249-fb429a0ed84d" (UID: "4fe9eb98-c991-43c4-9249-fb429a0ed84d"). InnerVolumeSpecName "kube-api-access-j9b4b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.436271 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fe9eb98-c991-43c4-9249-fb429a0ed84d-config\") on node \"crc\" DevicePath \"\""
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.436340 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4fe9eb98-c991-43c4-9249-fb429a0ed84d-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.436403 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9b4b\" (UniqueName: \"kubernetes.io/projected/4fe9eb98-c991-43c4-9249-fb429a0ed84d-kube-api-access-j9b4b\") on node \"crc\" DevicePath \"\""
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.436421 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe9eb98-c991-43c4-9249-fb429a0ed84d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.504595 4749 generic.go:334] "Generic (PLEG): container finished" podID="4fe9eb98-c991-43c4-9249-fb429a0ed84d" containerID="ca9dc21a51e8fbaac724524d721ffde6e29cdf61299844c6284d9e5da77367d3" exitCode=0
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.504667 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" event={"ID":"4fe9eb98-c991-43c4-9249-fb429a0ed84d","Type":"ContainerDied","Data":"ca9dc21a51e8fbaac724524d721ffde6e29cdf61299844c6284d9e5da77367d3"}
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.504717 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p" event={"ID":"4fe9eb98-c991-43c4-9249-fb429a0ed84d","Type":"ContainerDied","Data":"868c58713f09391ecdbc6984a9492ed829f5aec214b406f0401fa3a1d6e5a1cc"}
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.504726 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p"
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.504740 4749 scope.go:117] "RemoveContainer" containerID="ca9dc21a51e8fbaac724524d721ffde6e29cdf61299844c6284d9e5da77367d3"
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.528899 4749 scope.go:117] "RemoveContainer" containerID="ca9dc21a51e8fbaac724524d721ffde6e29cdf61299844c6284d9e5da77367d3"
Mar 10 15:55:12 crc kubenswrapper[4749]: E0310 15:55:12.529551 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9dc21a51e8fbaac724524d721ffde6e29cdf61299844c6284d9e5da77367d3\": container with ID starting with ca9dc21a51e8fbaac724524d721ffde6e29cdf61299844c6284d9e5da77367d3 not found: ID does not exist" containerID="ca9dc21a51e8fbaac724524d721ffde6e29cdf61299844c6284d9e5da77367d3"
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.529628 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9dc21a51e8fbaac724524d721ffde6e29cdf61299844c6284d9e5da77367d3"} err="failed to get container status \"ca9dc21a51e8fbaac724524d721ffde6e29cdf61299844c6284d9e5da77367d3\": rpc error: code = NotFound desc = could not find container \"ca9dc21a51e8fbaac724524d721ffde6e29cdf61299844c6284d9e5da77367d3\": container with ID starting with ca9dc21a51e8fbaac724524d721ffde6e29cdf61299844c6284d9e5da77367d3 not found: ID does not exist"
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.544654 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p"]
Mar 10 15:55:12 crc kubenswrapper[4749]: I0310 15:55:12.548067 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b45d8f5-8gq9p"]
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.234799 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"]
Mar 10 15:55:13 crc kubenswrapper[4749]: E0310 15:55:13.235109 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe9eb98-c991-43c4-9249-fb429a0ed84d" containerName="route-controller-manager"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.235128 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe9eb98-c991-43c4-9249-fb429a0ed84d" containerName="route-controller-manager"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.235239 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe9eb98-c991-43c4-9249-fb429a0ed84d" containerName="route-controller-manager"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.235800 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.238917 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.239035 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.239313 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.239330 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.239695 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.240881 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.256881 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"]
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.348316 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/788ac2a5-f4c8-412c-809d-115431be5d53-config\") pod \"route-controller-manager-7678dbd556-b6lgj\" (UID: \"788ac2a5-f4c8-412c-809d-115431be5d53\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.348409 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpgwr\" (UniqueName: \"kubernetes.io/projected/788ac2a5-f4c8-412c-809d-115431be5d53-kube-api-access-gpgwr\") pod \"route-controller-manager-7678dbd556-b6lgj\" (UID: \"788ac2a5-f4c8-412c-809d-115431be5d53\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.348485 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/788ac2a5-f4c8-412c-809d-115431be5d53-serving-cert\") pod \"route-controller-manager-7678dbd556-b6lgj\" (UID: \"788ac2a5-f4c8-412c-809d-115431be5d53\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.348529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/788ac2a5-f4c8-412c-809d-115431be5d53-client-ca\") pod \"route-controller-manager-7678dbd556-b6lgj\" (UID: \"788ac2a5-f4c8-412c-809d-115431be5d53\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.450319 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpgwr\" (UniqueName: \"kubernetes.io/projected/788ac2a5-f4c8-412c-809d-115431be5d53-kube-api-access-gpgwr\") pod \"route-controller-manager-7678dbd556-b6lgj\" (UID: \"788ac2a5-f4c8-412c-809d-115431be5d53\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.450413 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/788ac2a5-f4c8-412c-809d-115431be5d53-serving-cert\") pod \"route-controller-manager-7678dbd556-b6lgj\" (UID: \"788ac2a5-f4c8-412c-809d-115431be5d53\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.450474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/788ac2a5-f4c8-412c-809d-115431be5d53-client-ca\") pod \"route-controller-manager-7678dbd556-b6lgj\" (UID: \"788ac2a5-f4c8-412c-809d-115431be5d53\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.450529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/788ac2a5-f4c8-412c-809d-115431be5d53-config\") pod \"route-controller-manager-7678dbd556-b6lgj\" (UID: \"788ac2a5-f4c8-412c-809d-115431be5d53\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.452226 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/788ac2a5-f4c8-412c-809d-115431be5d53-client-ca\") pod \"route-controller-manager-7678dbd556-b6lgj\" (UID: \"788ac2a5-f4c8-412c-809d-115431be5d53\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.452421 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/788ac2a5-f4c8-412c-809d-115431be5d53-config\") pod \"route-controller-manager-7678dbd556-b6lgj\" (UID: \"788ac2a5-f4c8-412c-809d-115431be5d53\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.456902 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/788ac2a5-f4c8-412c-809d-115431be5d53-serving-cert\") pod \"route-controller-manager-7678dbd556-b6lgj\" (UID: \"788ac2a5-f4c8-412c-809d-115431be5d53\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.467871 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpgwr\" (UniqueName: \"kubernetes.io/projected/788ac2a5-f4c8-412c-809d-115431be5d53-kube-api-access-gpgwr\") pod \"route-controller-manager-7678dbd556-b6lgj\" (UID: \"788ac2a5-f4c8-412c-809d-115431be5d53\") " pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.558933 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.616151 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fe9eb98-c991-43c4-9249-fb429a0ed84d" path="/var/lib/kubelet/pods/4fe9eb98-c991-43c4-9249-fb429a0ed84d/volumes"
Mar 10 15:55:13 crc kubenswrapper[4749]: I0310 15:55:13.967554 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"]
Mar 10 15:55:14 crc kubenswrapper[4749]: I0310 15:55:14.519432 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj" event={"ID":"788ac2a5-f4c8-412c-809d-115431be5d53","Type":"ContainerStarted","Data":"308b83e66f735ae1c06fce994757e8005e6099f2878d1fc234ca7f6e8e803d5b"}
Mar 10 15:55:14 crc kubenswrapper[4749]: I0310 15:55:14.519484 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj" event={"ID":"788ac2a5-f4c8-412c-809d-115431be5d53","Type":"ContainerStarted","Data":"90fd52a555f7c9df67288f3134da71a5b858f680a739261659507324fc0b7da1"}
Mar 10 15:55:14 crc kubenswrapper[4749]: I0310 15:55:14.519848 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:14 crc kubenswrapper[4749]: I0310 15:55:14.529465 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj"
Mar 10 15:55:14 crc kubenswrapper[4749]: I0310 15:55:14.540536 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7678dbd556-b6lgj" podStartSLOduration=3.540507476 podStartE2EDuration="3.540507476s" podCreationTimestamp="2026-03-10 15:55:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:55:14.535521484 +0000 UTC m=+411.657387181" watchObservedRunningTime="2026-03-10 15:55:14.540507476 +0000 UTC m=+411.662373163"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.119912 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xvr84"]
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.122309 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xvr84"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.126829 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xvr84"]
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.127466 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.227908 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8294b189-cc7b-45fa-a350-d0fe5bd015ee-utilities\") pod \"redhat-operators-xvr84\" (UID: \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\") " pod="openshift-marketplace/redhat-operators-xvr84"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.228018 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj77c\" (UniqueName: \"kubernetes.io/projected/8294b189-cc7b-45fa-a350-d0fe5bd015ee-kube-api-access-kj77c\") pod \"redhat-operators-xvr84\" (UID: \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\") " pod="openshift-marketplace/redhat-operators-xvr84"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.228060 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8294b189-cc7b-45fa-a350-d0fe5bd015ee-catalog-content\") pod \"redhat-operators-xvr84\" (UID: \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\") " pod="openshift-marketplace/redhat-operators-xvr84"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.310873 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wxr96"]
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.312961 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxr96"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.316525 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.321053 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wxr96"]
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.329134 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8294b189-cc7b-45fa-a350-d0fe5bd015ee-utilities\") pod \"redhat-operators-xvr84\" (UID: \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\") " pod="openshift-marketplace/redhat-operators-xvr84"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.329193 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj77c\" (UniqueName: \"kubernetes.io/projected/8294b189-cc7b-45fa-a350-d0fe5bd015ee-kube-api-access-kj77c\") pod \"redhat-operators-xvr84\" (UID: \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\") " pod="openshift-marketplace/redhat-operators-xvr84"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.329222 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8294b189-cc7b-45fa-a350-d0fe5bd015ee-catalog-content\") pod \"redhat-operators-xvr84\" (UID: \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\") " pod="openshift-marketplace/redhat-operators-xvr84"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.329766 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8294b189-cc7b-45fa-a350-d0fe5bd015ee-catalog-content\") pod \"redhat-operators-xvr84\" (UID: \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\") " pod="openshift-marketplace/redhat-operators-xvr84"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.329794 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8294b189-cc7b-45fa-a350-d0fe5bd015ee-utilities\") pod \"redhat-operators-xvr84\" (UID: \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\") " pod="openshift-marketplace/redhat-operators-xvr84"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.356040 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj77c\" (UniqueName: \"kubernetes.io/projected/8294b189-cc7b-45fa-a350-d0fe5bd015ee-kube-api-access-kj77c\") pod \"redhat-operators-xvr84\" (UID: \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\") " pod="openshift-marketplace/redhat-operators-xvr84"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.430981 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61ae455d-1747-4883-b19d-3cbe4aa77dcd-catalog-content\") pod \"certified-operators-wxr96\" (UID: \"61ae455d-1747-4883-b19d-3cbe4aa77dcd\") " pod="openshift-marketplace/certified-operators-wxr96"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.431077 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61ae455d-1747-4883-b19d-3cbe4aa77dcd-utilities\") pod \"certified-operators-wxr96\" (UID: \"61ae455d-1747-4883-b19d-3cbe4aa77dcd\") " pod="openshift-marketplace/certified-operators-wxr96"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.431121 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfvqp\" (UniqueName: \"kubernetes.io/projected/61ae455d-1747-4883-b19d-3cbe4aa77dcd-kube-api-access-hfvqp\") pod \"certified-operators-wxr96\" (UID: \"61ae455d-1747-4883-b19d-3cbe4aa77dcd\") " pod="openshift-marketplace/certified-operators-wxr96"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.441250 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xvr84"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.533103 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61ae455d-1747-4883-b19d-3cbe4aa77dcd-catalog-content\") pod \"certified-operators-wxr96\" (UID: \"61ae455d-1747-4883-b19d-3cbe4aa77dcd\") " pod="openshift-marketplace/certified-operators-wxr96"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.533189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61ae455d-1747-4883-b19d-3cbe4aa77dcd-utilities\") pod \"certified-operators-wxr96\" (UID: \"61ae455d-1747-4883-b19d-3cbe4aa77dcd\") " pod="openshift-marketplace/certified-operators-wxr96"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.533226 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfvqp\" (UniqueName: \"kubernetes.io/projected/61ae455d-1747-4883-b19d-3cbe4aa77dcd-kube-api-access-hfvqp\") pod \"certified-operators-wxr96\" (UID: \"61ae455d-1747-4883-b19d-3cbe4aa77dcd\") " pod="openshift-marketplace/certified-operators-wxr96"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.534004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61ae455d-1747-4883-b19d-3cbe4aa77dcd-catalog-content\") pod \"certified-operators-wxr96\" (UID: \"61ae455d-1747-4883-b19d-3cbe4aa77dcd\") " pod="openshift-marketplace/certified-operators-wxr96"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.534405 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61ae455d-1747-4883-b19d-3cbe4aa77dcd-utilities\") pod \"certified-operators-wxr96\" (UID: \"61ae455d-1747-4883-b19d-3cbe4aa77dcd\") " pod="openshift-marketplace/certified-operators-wxr96"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.561673 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfvqp\" (UniqueName: \"kubernetes.io/projected/61ae455d-1747-4883-b19d-3cbe4aa77dcd-kube-api-access-hfvqp\") pod \"certified-operators-wxr96\" (UID: \"61ae455d-1747-4883-b19d-3cbe4aa77dcd\") " pod="openshift-marketplace/certified-operators-wxr96"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.634053 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxr96"
Mar 10 15:55:26 crc kubenswrapper[4749]: I0310 15:55:26.875719 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xvr84"]
Mar 10 15:55:26 crc kubenswrapper[4749]: W0310 15:55:26.882034 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8294b189_cc7b_45fa_a350_d0fe5bd015ee.slice/crio-923e63d3e5532bd8d24656a0f4d5dcb053869ed76171708c38bfbe67af46cbd0 WatchSource:0}: Error finding container 923e63d3e5532bd8d24656a0f4d5dcb053869ed76171708c38bfbe67af46cbd0: Status 404 returned error can't find the container with id 923e63d3e5532bd8d24656a0f4d5dcb053869ed76171708c38bfbe67af46cbd0
Mar 10 15:55:27 crc kubenswrapper[4749]: I0310 15:55:27.035366 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wxr96"]
Mar 10 15:55:27 crc kubenswrapper[4749]: W0310 15:55:27.043022 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61ae455d_1747_4883_b19d_3cbe4aa77dcd.slice/crio-e9f3c028bf0ffe1c53c84348239b6ae4275d9e38168deb132866bd7168d87257 WatchSource:0}: Error finding container e9f3c028bf0ffe1c53c84348239b6ae4275d9e38168deb132866bd7168d87257: Status 404 returned error can't find the container with id e9f3c028bf0ffe1c53c84348239b6ae4275d9e38168deb132866bd7168d87257
Mar 10 15:55:27 crc kubenswrapper[4749]: I0310 15:55:27.597336 4749 generic.go:334] "Generic (PLEG): container finished" podID="8294b189-cc7b-45fa-a350-d0fe5bd015ee" containerID="8f55318d965c0610ff6024a3687cc64d10ea059e8e1e84fc652960e753ef5434" exitCode=0
Mar 10 15:55:27 crc kubenswrapper[4749]: I0310 15:55:27.597441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvr84" event={"ID":"8294b189-cc7b-45fa-a350-d0fe5bd015ee","Type":"ContainerDied","Data":"8f55318d965c0610ff6024a3687cc64d10ea059e8e1e84fc652960e753ef5434"}
Mar 10 15:55:27 crc kubenswrapper[4749]: I0310 15:55:27.597492 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvr84" event={"ID":"8294b189-cc7b-45fa-a350-d0fe5bd015ee","Type":"ContainerStarted","Data":"923e63d3e5532bd8d24656a0f4d5dcb053869ed76171708c38bfbe67af46cbd0"}
Mar 10 15:55:27 crc kubenswrapper[4749]: I0310 15:55:27.600463 4749 generic.go:334] "Generic (PLEG): container finished" podID="61ae455d-1747-4883-b19d-3cbe4aa77dcd" containerID="06560b00beeea4809383e4b54f854c2f7c1fbde3c9b9cfd5a2bf7c600c2ab533" exitCode=0
Mar 10 15:55:27 crc kubenswrapper[4749]: I0310 15:55:27.600521 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxr96" event={"ID":"61ae455d-1747-4883-b19d-3cbe4aa77dcd","Type":"ContainerDied","Data":"06560b00beeea4809383e4b54f854c2f7c1fbde3c9b9cfd5a2bf7c600c2ab533"}
Mar 10 15:55:27 crc kubenswrapper[4749]: I0310 15:55:27.600587 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxr96" event={"ID":"61ae455d-1747-4883-b19d-3cbe4aa77dcd","Type":"ContainerStarted","Data":"e9f3c028bf0ffe1c53c84348239b6ae4275d9e38168deb132866bd7168d87257"}
Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.502801 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7h9wh"]
Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.504321 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7h9wh"
Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.507714 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.518180 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7h9wh"]
Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.560398 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08fd18d0-2c32-414f-a725-a54c904db468-catalog-content\") pod \"community-operators-7h9wh\" (UID: \"08fd18d0-2c32-414f-a725-a54c904db468\") " pod="openshift-marketplace/community-operators-7h9wh"
Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.560534 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmn4r\" (UniqueName: \"kubernetes.io/projected/08fd18d0-2c32-414f-a725-a54c904db468-kube-api-access-pmn4r\") pod \"community-operators-7h9wh\" (UID: \"08fd18d0-2c32-414f-a725-a54c904db468\") " pod="openshift-marketplace/community-operators-7h9wh"
Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.560724 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08fd18d0-2c32-414f-a725-a54c904db468-utilities\") pod \"community-operators-7h9wh\" (UID: \"08fd18d0-2c32-414f-a725-a54c904db468\") " pod="openshift-marketplace/community-operators-7h9wh"
Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.609099 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxr96" event={"ID":"61ae455d-1747-4883-b19d-3cbe4aa77dcd","Type":"ContainerStarted","Data":"4a2840b4c4163da28830b21f1eb386d15be1fd410e8b72dbf710d1902e1cb09d"}
Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.661572 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08fd18d0-2c32-414f-a725-a54c904db468-catalog-content\") pod \"community-operators-7h9wh\" (UID: \"08fd18d0-2c32-414f-a725-a54c904db468\") " pod="openshift-marketplace/community-operators-7h9wh"
Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.661642 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmn4r\" (UniqueName: \"kubernetes.io/projected/08fd18d0-2c32-414f-a725-a54c904db468-kube-api-access-pmn4r\") pod \"community-operators-7h9wh\" (UID: \"08fd18d0-2c32-414f-a725-a54c904db468\") " pod="openshift-marketplace/community-operators-7h9wh"
Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.661698 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08fd18d0-2c32-414f-a725-a54c904db468-utilities\") pod \"community-operators-7h9wh\" (UID: \"08fd18d0-2c32-414f-a725-a54c904db468\") " pod="openshift-marketplace/community-operators-7h9wh"
Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.662121 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08fd18d0-2c32-414f-a725-a54c904db468-catalog-content\") pod \"community-operators-7h9wh\" (UID: \"08fd18d0-2c32-414f-a725-a54c904db468\") "
pod="openshift-marketplace/community-operators-7h9wh" Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.662272 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08fd18d0-2c32-414f-a725-a54c904db468-utilities\") pod \"community-operators-7h9wh\" (UID: \"08fd18d0-2c32-414f-a725-a54c904db468\") " pod="openshift-marketplace/community-operators-7h9wh" Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.679509 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmn4r\" (UniqueName: \"kubernetes.io/projected/08fd18d0-2c32-414f-a725-a54c904db468-kube-api-access-pmn4r\") pod \"community-operators-7h9wh\" (UID: \"08fd18d0-2c32-414f-a725-a54c904db468\") " pod="openshift-marketplace/community-operators-7h9wh" Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.716312 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pcmnl"] Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.718750 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.720852 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.734522 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pcmnl"] Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.824536 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7h9wh" Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.864713 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb151eb3-5433-4e8c-a9ac-556a3172438a-utilities\") pod \"redhat-marketplace-pcmnl\" (UID: \"fb151eb3-5433-4e8c-a9ac-556a3172438a\") " pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.864777 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rxqj\" (UniqueName: \"kubernetes.io/projected/fb151eb3-5433-4e8c-a9ac-556a3172438a-kube-api-access-4rxqj\") pod \"redhat-marketplace-pcmnl\" (UID: \"fb151eb3-5433-4e8c-a9ac-556a3172438a\") " pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.864836 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb151eb3-5433-4e8c-a9ac-556a3172438a-catalog-content\") pod \"redhat-marketplace-pcmnl\" (UID: \"fb151eb3-5433-4e8c-a9ac-556a3172438a\") " pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.965671 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb151eb3-5433-4e8c-a9ac-556a3172438a-utilities\") pod \"redhat-marketplace-pcmnl\" (UID: \"fb151eb3-5433-4e8c-a9ac-556a3172438a\") " pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.966021 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rxqj\" (UniqueName: \"kubernetes.io/projected/fb151eb3-5433-4e8c-a9ac-556a3172438a-kube-api-access-4rxqj\") pod 
\"redhat-marketplace-pcmnl\" (UID: \"fb151eb3-5433-4e8c-a9ac-556a3172438a\") " pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.966071 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb151eb3-5433-4e8c-a9ac-556a3172438a-catalog-content\") pod \"redhat-marketplace-pcmnl\" (UID: \"fb151eb3-5433-4e8c-a9ac-556a3172438a\") " pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.966775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb151eb3-5433-4e8c-a9ac-556a3172438a-catalog-content\") pod \"redhat-marketplace-pcmnl\" (UID: \"fb151eb3-5433-4e8c-a9ac-556a3172438a\") " pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.966894 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb151eb3-5433-4e8c-a9ac-556a3172438a-utilities\") pod \"redhat-marketplace-pcmnl\" (UID: \"fb151eb3-5433-4e8c-a9ac-556a3172438a\") " pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:28 crc kubenswrapper[4749]: I0310 15:55:28.985079 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rxqj\" (UniqueName: \"kubernetes.io/projected/fb151eb3-5433-4e8c-a9ac-556a3172438a-kube-api-access-4rxqj\") pod \"redhat-marketplace-pcmnl\" (UID: \"fb151eb3-5433-4e8c-a9ac-556a3172438a\") " pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:29 crc kubenswrapper[4749]: I0310 15:55:29.045976 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:29 crc kubenswrapper[4749]: I0310 15:55:29.235914 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7h9wh"] Mar 10 15:55:29 crc kubenswrapper[4749]: W0310 15:55:29.255844 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08fd18d0_2c32_414f_a725_a54c904db468.slice/crio-388c64616a919e87d6cfd071f5d9706d3f4e672cc91c0d3a936e58e72a0a52ca WatchSource:0}: Error finding container 388c64616a919e87d6cfd071f5d9706d3f4e672cc91c0d3a936e58e72a0a52ca: Status 404 returned error can't find the container with id 388c64616a919e87d6cfd071f5d9706d3f4e672cc91c0d3a936e58e72a0a52ca Mar 10 15:55:29 crc kubenswrapper[4749]: I0310 15:55:29.257780 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pcmnl"] Mar 10 15:55:29 crc kubenswrapper[4749]: W0310 15:55:29.269724 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb151eb3_5433_4e8c_a9ac_556a3172438a.slice/crio-f195d537ea038f7fa1927bbf9a085cfa17ee29149b903219622d3274544b2ba1 WatchSource:0}: Error finding container f195d537ea038f7fa1927bbf9a085cfa17ee29149b903219622d3274544b2ba1: Status 404 returned error can't find the container with id f195d537ea038f7fa1927bbf9a085cfa17ee29149b903219622d3274544b2ba1 Mar 10 15:55:29 crc kubenswrapper[4749]: I0310 15:55:29.618157 4749 generic.go:334] "Generic (PLEG): container finished" podID="fb151eb3-5433-4e8c-a9ac-556a3172438a" containerID="fa15b163c52c690b31943ab46adf504d44046dbd2d3c2426df68ea284653214e" exitCode=0 Mar 10 15:55:29 crc kubenswrapper[4749]: I0310 15:55:29.618318 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcmnl" 
event={"ID":"fb151eb3-5433-4e8c-a9ac-556a3172438a","Type":"ContainerDied","Data":"fa15b163c52c690b31943ab46adf504d44046dbd2d3c2426df68ea284653214e"} Mar 10 15:55:29 crc kubenswrapper[4749]: I0310 15:55:29.618400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcmnl" event={"ID":"fb151eb3-5433-4e8c-a9ac-556a3172438a","Type":"ContainerStarted","Data":"f195d537ea038f7fa1927bbf9a085cfa17ee29149b903219622d3274544b2ba1"} Mar 10 15:55:29 crc kubenswrapper[4749]: I0310 15:55:29.620699 4749 generic.go:334] "Generic (PLEG): container finished" podID="08fd18d0-2c32-414f-a725-a54c904db468" containerID="01275aaf795676a5bfad9d7c9322c07f4512c7f5b516f18cb0f370bfd5d519c5" exitCode=0 Mar 10 15:55:29 crc kubenswrapper[4749]: I0310 15:55:29.620741 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7h9wh" event={"ID":"08fd18d0-2c32-414f-a725-a54c904db468","Type":"ContainerDied","Data":"01275aaf795676a5bfad9d7c9322c07f4512c7f5b516f18cb0f370bfd5d519c5"} Mar 10 15:55:29 crc kubenswrapper[4749]: I0310 15:55:29.620797 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7h9wh" event={"ID":"08fd18d0-2c32-414f-a725-a54c904db468","Type":"ContainerStarted","Data":"388c64616a919e87d6cfd071f5d9706d3f4e672cc91c0d3a936e58e72a0a52ca"} Mar 10 15:55:29 crc kubenswrapper[4749]: I0310 15:55:29.623738 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvr84" event={"ID":"8294b189-cc7b-45fa-a350-d0fe5bd015ee","Type":"ContainerStarted","Data":"a0966eedb53ad9e89bc2e8ef010d6b1f0bcfeb04f1d0bc2771c7f5964851a535"} Mar 10 15:55:29 crc kubenswrapper[4749]: I0310 15:55:29.628279 4749 generic.go:334] "Generic (PLEG): container finished" podID="61ae455d-1747-4883-b19d-3cbe4aa77dcd" containerID="4a2840b4c4163da28830b21f1eb386d15be1fd410e8b72dbf710d1902e1cb09d" exitCode=0 Mar 10 15:55:29 crc kubenswrapper[4749]: I0310 
15:55:29.628334 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxr96" event={"ID":"61ae455d-1747-4883-b19d-3cbe4aa77dcd","Type":"ContainerDied","Data":"4a2840b4c4163da28830b21f1eb386d15be1fd410e8b72dbf710d1902e1cb09d"} Mar 10 15:55:30 crc kubenswrapper[4749]: I0310 15:55:30.644756 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7h9wh" event={"ID":"08fd18d0-2c32-414f-a725-a54c904db468","Type":"ContainerStarted","Data":"ad76a47a15e57598bca4cb14445b85ba23b8d77de17bb47f2e4155e073f35eb6"} Mar 10 15:55:30 crc kubenswrapper[4749]: I0310 15:55:30.648525 4749 generic.go:334] "Generic (PLEG): container finished" podID="8294b189-cc7b-45fa-a350-d0fe5bd015ee" containerID="a0966eedb53ad9e89bc2e8ef010d6b1f0bcfeb04f1d0bc2771c7f5964851a535" exitCode=0 Mar 10 15:55:30 crc kubenswrapper[4749]: I0310 15:55:30.648606 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvr84" event={"ID":"8294b189-cc7b-45fa-a350-d0fe5bd015ee","Type":"ContainerDied","Data":"a0966eedb53ad9e89bc2e8ef010d6b1f0bcfeb04f1d0bc2771c7f5964851a535"} Mar 10 15:55:30 crc kubenswrapper[4749]: I0310 15:55:30.654425 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxr96" event={"ID":"61ae455d-1747-4883-b19d-3cbe4aa77dcd","Type":"ContainerStarted","Data":"a7909216e2d305ad9d5c707044f03e1febdd8bb1db78763a7f7fa11cbaaa1217"} Mar 10 15:55:30 crc kubenswrapper[4749]: I0310 15:55:30.658477 4749 generic.go:334] "Generic (PLEG): container finished" podID="fb151eb3-5433-4e8c-a9ac-556a3172438a" containerID="e9b2494527d4a40402f2a1ee0334b84107a4f28b03f9c633c1f1c55cd6ca294c" exitCode=0 Mar 10 15:55:30 crc kubenswrapper[4749]: I0310 15:55:30.658529 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcmnl" 
event={"ID":"fb151eb3-5433-4e8c-a9ac-556a3172438a","Type":"ContainerDied","Data":"e9b2494527d4a40402f2a1ee0334b84107a4f28b03f9c633c1f1c55cd6ca294c"} Mar 10 15:55:30 crc kubenswrapper[4749]: I0310 15:55:30.745128 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wxr96" podStartSLOduration=2.249038404 podStartE2EDuration="4.745100475s" podCreationTimestamp="2026-03-10 15:55:26 +0000 UTC" firstStartedPulling="2026-03-10 15:55:27.602562814 +0000 UTC m=+424.724428511" lastFinishedPulling="2026-03-10 15:55:30.098624895 +0000 UTC m=+427.220490582" observedRunningTime="2026-03-10 15:55:30.741524172 +0000 UTC m=+427.863389869" watchObservedRunningTime="2026-03-10 15:55:30.745100475 +0000 UTC m=+427.866966162" Mar 10 15:55:31 crc kubenswrapper[4749]: I0310 15:55:31.667482 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvr84" event={"ID":"8294b189-cc7b-45fa-a350-d0fe5bd015ee","Type":"ContainerStarted","Data":"f0f59ae3230b9784583d124f922c23640d2fe5f0820db0c175ea8c0ad97e738f"} Mar 10 15:55:31 crc kubenswrapper[4749]: I0310 15:55:31.669672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcmnl" event={"ID":"fb151eb3-5433-4e8c-a9ac-556a3172438a","Type":"ContainerStarted","Data":"d35ccca48b97b960240767860e31ddf38730fc3eae01a50fe8c56b3ea7ee121b"} Mar 10 15:55:31 crc kubenswrapper[4749]: I0310 15:55:31.673144 4749 generic.go:334] "Generic (PLEG): container finished" podID="08fd18d0-2c32-414f-a725-a54c904db468" containerID="ad76a47a15e57598bca4cb14445b85ba23b8d77de17bb47f2e4155e073f35eb6" exitCode=0 Mar 10 15:55:31 crc kubenswrapper[4749]: I0310 15:55:31.673925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7h9wh" 
event={"ID":"08fd18d0-2c32-414f-a725-a54c904db468","Type":"ContainerDied","Data":"ad76a47a15e57598bca4cb14445b85ba23b8d77de17bb47f2e4155e073f35eb6"} Mar 10 15:55:31 crc kubenswrapper[4749]: I0310 15:55:31.689647 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xvr84" podStartSLOduration=2.237252997 podStartE2EDuration="5.689617177s" podCreationTimestamp="2026-03-10 15:55:26 +0000 UTC" firstStartedPulling="2026-03-10 15:55:27.600970118 +0000 UTC m=+424.722835805" lastFinishedPulling="2026-03-10 15:55:31.053334298 +0000 UTC m=+428.175199985" observedRunningTime="2026-03-10 15:55:31.688412372 +0000 UTC m=+428.810278059" watchObservedRunningTime="2026-03-10 15:55:31.689617177 +0000 UTC m=+428.811482884" Mar 10 15:55:31 crc kubenswrapper[4749]: I0310 15:55:31.708870 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pcmnl" podStartSLOduration=2.290624525 podStartE2EDuration="3.708849134s" podCreationTimestamp="2026-03-10 15:55:28 +0000 UTC" firstStartedPulling="2026-03-10 15:55:29.619985868 +0000 UTC m=+426.741851555" lastFinishedPulling="2026-03-10 15:55:31.038210477 +0000 UTC m=+428.160076164" observedRunningTime="2026-03-10 15:55:31.703828021 +0000 UTC m=+428.825693708" watchObservedRunningTime="2026-03-10 15:55:31.708849134 +0000 UTC m=+428.830714821" Mar 10 15:55:32 crc kubenswrapper[4749]: I0310 15:55:32.684647 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7h9wh" event={"ID":"08fd18d0-2c32-414f-a725-a54c904db468","Type":"ContainerStarted","Data":"21defeb529fc05f70c135a43ae88253c8b056758fdb6852403048244ab282f16"} Mar 10 15:55:32 crc kubenswrapper[4749]: I0310 15:55:32.710421 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7h9wh" podStartSLOduration=2.243460403 podStartE2EDuration="4.710402057s" 
podCreationTimestamp="2026-03-10 15:55:28 +0000 UTC" firstStartedPulling="2026-03-10 15:55:29.622790708 +0000 UTC m=+426.744656395" lastFinishedPulling="2026-03-10 15:55:32.089732352 +0000 UTC m=+429.211598049" observedRunningTime="2026-03-10 15:55:32.70734269 +0000 UTC m=+429.829208377" watchObservedRunningTime="2026-03-10 15:55:32.710402057 +0000 UTC m=+429.832267744" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.382598 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xww7q"] Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.383763 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.399541 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xww7q"] Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.442303 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xvr84" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.442458 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xvr84" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.580230 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2002b361-a4bd-43a9-a319-c42065011aeb-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.580406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/2002b361-a4bd-43a9-a319-c42065011aeb-trusted-ca\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.580481 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2002b361-a4bd-43a9-a319-c42065011aeb-registry-certificates\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.580517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2002b361-a4bd-43a9-a319-c42065011aeb-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.580553 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2002b361-a4bd-43a9-a319-c42065011aeb-registry-tls\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.581073 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl8g2\" (UniqueName: \"kubernetes.io/projected/2002b361-a4bd-43a9-a319-c42065011aeb-kube-api-access-xl8g2\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc 
kubenswrapper[4749]: I0310 15:55:36.581168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.581218 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2002b361-a4bd-43a9-a319-c42065011aeb-bound-sa-token\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.613495 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.635002 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wxr96" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.635072 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wxr96" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.676712 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wxr96" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.683230 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2002b361-a4bd-43a9-a319-c42065011aeb-bound-sa-token\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.683297 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2002b361-a4bd-43a9-a319-c42065011aeb-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.683337 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2002b361-a4bd-43a9-a319-c42065011aeb-trusted-ca\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.683480 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2002b361-a4bd-43a9-a319-c42065011aeb-registry-certificates\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.683557 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2002b361-a4bd-43a9-a319-c42065011aeb-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: 
I0310 15:55:36.683616 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2002b361-a4bd-43a9-a319-c42065011aeb-registry-tls\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.683654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl8g2\" (UniqueName: \"kubernetes.io/projected/2002b361-a4bd-43a9-a319-c42065011aeb-kube-api-access-xl8g2\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.684707 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2002b361-a4bd-43a9-a319-c42065011aeb-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.686002 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2002b361-a4bd-43a9-a319-c42065011aeb-registry-certificates\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.687326 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2002b361-a4bd-43a9-a319-c42065011aeb-trusted-ca\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.691618 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2002b361-a4bd-43a9-a319-c42065011aeb-registry-tls\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.696438 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2002b361-a4bd-43a9-a319-c42065011aeb-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.713285 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2002b361-a4bd-43a9-a319-c42065011aeb-bound-sa-token\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.713466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl8g2\" (UniqueName: \"kubernetes.io/projected/2002b361-a4bd-43a9-a319-c42065011aeb-kube-api-access-xl8g2\") pod \"image-registry-66df7c8f76-xww7q\" (UID: \"2002b361-a4bd-43a9-a319-c42065011aeb\") " pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:36 crc kubenswrapper[4749]: I0310 15:55:36.749832 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wxr96" Mar 10 15:55:37 crc kubenswrapper[4749]: I0310 15:55:37.000526 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:37 crc kubenswrapper[4749]: I0310 15:55:37.446200 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xww7q"] Mar 10 15:55:37 crc kubenswrapper[4749]: I0310 15:55:37.491761 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xvr84" podUID="8294b189-cc7b-45fa-a350-d0fe5bd015ee" containerName="registry-server" probeResult="failure" output=< Mar 10 15:55:37 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 10 15:55:37 crc kubenswrapper[4749]: > Mar 10 15:55:37 crc kubenswrapper[4749]: I0310 15:55:37.717554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" event={"ID":"2002b361-a4bd-43a9-a319-c42065011aeb","Type":"ContainerStarted","Data":"00f6c564593581c8a80c791ab6793c6a9a912b6d7150a59b129a89f5d049ba23"} Mar 10 15:55:38 crc kubenswrapper[4749]: I0310 15:55:38.727151 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" event={"ID":"2002b361-a4bd-43a9-a319-c42065011aeb","Type":"ContainerStarted","Data":"7dd7da5cebc94d09e0f3277bc04332deefed7c43b50c5950ab10b022102a6bc7"} Mar 10 15:55:38 crc kubenswrapper[4749]: I0310 15:55:38.825143 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7h9wh" Mar 10 15:55:38 crc kubenswrapper[4749]: I0310 15:55:38.825532 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7h9wh" Mar 10 15:55:38 crc kubenswrapper[4749]: I0310 15:55:38.878229 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7h9wh" Mar 10 15:55:39 crc kubenswrapper[4749]: I0310 15:55:39.047445 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:39 crc kubenswrapper[4749]: I0310 15:55:39.047517 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:39 crc kubenswrapper[4749]: I0310 15:55:39.096254 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:39 crc kubenswrapper[4749]: I0310 15:55:39.754579 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" podStartSLOduration=3.754520299 podStartE2EDuration="3.754520299s" podCreationTimestamp="2026-03-10 15:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:55:39.754257521 +0000 UTC m=+436.876123228" watchObservedRunningTime="2026-03-10 15:55:39.754520299 +0000 UTC m=+436.876385996" Mar 10 15:55:39 crc kubenswrapper[4749]: I0310 15:55:39.781995 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7h9wh" Mar 10 15:55:39 crc kubenswrapper[4749]: I0310 15:55:39.783520 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pcmnl" Mar 10 15:55:46 crc kubenswrapper[4749]: I0310 15:55:46.487638 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xvr84" Mar 10 15:55:46 crc kubenswrapper[4749]: I0310 15:55:46.525703 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xvr84" Mar 10 15:55:47 crc kubenswrapper[4749]: I0310 15:55:47.001130 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:50 crc kubenswrapper[4749]: I0310 15:55:50.980527 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:55:50 crc kubenswrapper[4749]: I0310 15:55:50.981734 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:55:51 crc kubenswrapper[4749]: I0310 15:55:51.769973 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-598844dbc9-6f56f"] Mar 10 15:55:51 crc kubenswrapper[4749]: I0310 15:55:51.770757 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" podUID="500569ac-f2ab-4182-80a3-73d821946d15" containerName="controller-manager" containerID="cri-o://5429fad415ae51cc04d6f058a0bae3ba3d05228efc6703c6176862340482cbf0" gracePeriod=30 Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.186812 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.319312 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-client-ca\") pod \"500569ac-f2ab-4182-80a3-73d821946d15\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.319461 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7bps\" (UniqueName: \"kubernetes.io/projected/500569ac-f2ab-4182-80a3-73d821946d15-kube-api-access-d7bps\") pod \"500569ac-f2ab-4182-80a3-73d821946d15\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.319553 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/500569ac-f2ab-4182-80a3-73d821946d15-serving-cert\") pod \"500569ac-f2ab-4182-80a3-73d821946d15\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.319604 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-proxy-ca-bundles\") pod \"500569ac-f2ab-4182-80a3-73d821946d15\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.319658 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-config\") pod \"500569ac-f2ab-4182-80a3-73d821946d15\" (UID: \"500569ac-f2ab-4182-80a3-73d821946d15\") " Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.320142 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-client-ca" (OuterVolumeSpecName: "client-ca") pod "500569ac-f2ab-4182-80a3-73d821946d15" (UID: "500569ac-f2ab-4182-80a3-73d821946d15"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.320541 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "500569ac-f2ab-4182-80a3-73d821946d15" (UID: "500569ac-f2ab-4182-80a3-73d821946d15"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.320655 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-config" (OuterVolumeSpecName: "config") pod "500569ac-f2ab-4182-80a3-73d821946d15" (UID: "500569ac-f2ab-4182-80a3-73d821946d15"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.326870 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500569ac-f2ab-4182-80a3-73d821946d15-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "500569ac-f2ab-4182-80a3-73d821946d15" (UID: "500569ac-f2ab-4182-80a3-73d821946d15"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.327248 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500569ac-f2ab-4182-80a3-73d821946d15-kube-api-access-d7bps" (OuterVolumeSpecName: "kube-api-access-d7bps") pod "500569ac-f2ab-4182-80a3-73d821946d15" (UID: "500569ac-f2ab-4182-80a3-73d821946d15"). InnerVolumeSpecName "kube-api-access-d7bps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.421202 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7bps\" (UniqueName: \"kubernetes.io/projected/500569ac-f2ab-4182-80a3-73d821946d15-kube-api-access-d7bps\") on node \"crc\" DevicePath \"\"" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.421254 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/500569ac-f2ab-4182-80a3-73d821946d15-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.421264 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.421274 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.421284 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/500569ac-f2ab-4182-80a3-73d821946d15-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.829938 4749 generic.go:334] "Generic (PLEG): container finished" podID="500569ac-f2ab-4182-80a3-73d821946d15" containerID="5429fad415ae51cc04d6f058a0bae3ba3d05228efc6703c6176862340482cbf0" exitCode=0 Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.830019 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" event={"ID":"500569ac-f2ab-4182-80a3-73d821946d15","Type":"ContainerDied","Data":"5429fad415ae51cc04d6f058a0bae3ba3d05228efc6703c6176862340482cbf0"} Mar 10 15:55:52 crc 
kubenswrapper[4749]: I0310 15:55:52.830083 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" event={"ID":"500569ac-f2ab-4182-80a3-73d821946d15","Type":"ContainerDied","Data":"4522a1a40abceca350f44be8b4f706a147814cff34b6abb5e854bebbf9a1a4e3"} Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.830109 4749 scope.go:117] "RemoveContainer" containerID="5429fad415ae51cc04d6f058a0bae3ba3d05228efc6703c6176862340482cbf0" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.830024 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-598844dbc9-6f56f" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.856361 4749 scope.go:117] "RemoveContainer" containerID="5429fad415ae51cc04d6f058a0bae3ba3d05228efc6703c6176862340482cbf0" Mar 10 15:55:52 crc kubenswrapper[4749]: E0310 15:55:52.858455 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5429fad415ae51cc04d6f058a0bae3ba3d05228efc6703c6176862340482cbf0\": container with ID starting with 5429fad415ae51cc04d6f058a0bae3ba3d05228efc6703c6176862340482cbf0 not found: ID does not exist" containerID="5429fad415ae51cc04d6f058a0bae3ba3d05228efc6703c6176862340482cbf0" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.858522 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5429fad415ae51cc04d6f058a0bae3ba3d05228efc6703c6176862340482cbf0"} err="failed to get container status \"5429fad415ae51cc04d6f058a0bae3ba3d05228efc6703c6176862340482cbf0\": rpc error: code = NotFound desc = could not find container \"5429fad415ae51cc04d6f058a0bae3ba3d05228efc6703c6176862340482cbf0\": container with ID starting with 5429fad415ae51cc04d6f058a0bae3ba3d05228efc6703c6176862340482cbf0 not found: ID does not exist" Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 
15:55:52.866751 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-598844dbc9-6f56f"] Mar 10 15:55:52 crc kubenswrapper[4749]: I0310 15:55:52.872408 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-598844dbc9-6f56f"] Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.260430 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67c8b7b957-qtjq8"] Mar 10 15:55:53 crc kubenswrapper[4749]: E0310 15:55:53.260709 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500569ac-f2ab-4182-80a3-73d821946d15" containerName="controller-manager" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.260726 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="500569ac-f2ab-4182-80a3-73d821946d15" containerName="controller-manager" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.260877 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="500569ac-f2ab-4182-80a3-73d821946d15" containerName="controller-manager" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.261396 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.263676 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.266627 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.266711 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.266627 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.266950 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.267439 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.274910 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.280748 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c8b7b957-qtjq8"] Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.438064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ee977e3-6713-42e7-a5dc-60541afd18d1-config\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " 
pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.438458 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ee977e3-6713-42e7-a5dc-60541afd18d1-proxy-ca-bundles\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.438591 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szg5v\" (UniqueName: \"kubernetes.io/projected/4ee977e3-6713-42e7-a5dc-60541afd18d1-kube-api-access-szg5v\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.438690 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ee977e3-6713-42e7-a5dc-60541afd18d1-serving-cert\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.438787 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ee977e3-6713-42e7-a5dc-60541afd18d1-client-ca\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.540152 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/4ee977e3-6713-42e7-a5dc-60541afd18d1-config\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.540271 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ee977e3-6713-42e7-a5dc-60541afd18d1-proxy-ca-bundles\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.540311 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szg5v\" (UniqueName: \"kubernetes.io/projected/4ee977e3-6713-42e7-a5dc-60541afd18d1-kube-api-access-szg5v\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.540341 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ee977e3-6713-42e7-a5dc-60541afd18d1-serving-cert\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.540391 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ee977e3-6713-42e7-a5dc-60541afd18d1-client-ca\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 
15:55:53.541595 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ee977e3-6713-42e7-a5dc-60541afd18d1-client-ca\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.541764 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ee977e3-6713-42e7-a5dc-60541afd18d1-proxy-ca-bundles\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.542535 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ee977e3-6713-42e7-a5dc-60541afd18d1-config\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.546362 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ee977e3-6713-42e7-a5dc-60541afd18d1-serving-cert\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.562097 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szg5v\" (UniqueName: \"kubernetes.io/projected/4ee977e3-6713-42e7-a5dc-60541afd18d1-kube-api-access-szg5v\") pod \"controller-manager-67c8b7b957-qtjq8\" (UID: \"4ee977e3-6713-42e7-a5dc-60541afd18d1\") " 
pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.587173 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:53 crc kubenswrapper[4749]: I0310 15:55:53.618466 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="500569ac-f2ab-4182-80a3-73d821946d15" path="/var/lib/kubelet/pods/500569ac-f2ab-4182-80a3-73d821946d15/volumes" Mar 10 15:55:54 crc kubenswrapper[4749]: I0310 15:55:54.000788 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c8b7b957-qtjq8"] Mar 10 15:55:54 crc kubenswrapper[4749]: W0310 15:55:54.023249 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee977e3_6713_42e7_a5dc_60541afd18d1.slice/crio-6a12c78f2c56eab586bd1ec2ded2e133a8ce15a6db9c596b403747d89e174832 WatchSource:0}: Error finding container 6a12c78f2c56eab586bd1ec2ded2e133a8ce15a6db9c596b403747d89e174832: Status 404 returned error can't find the container with id 6a12c78f2c56eab586bd1ec2ded2e133a8ce15a6db9c596b403747d89e174832 Mar 10 15:55:54 crc kubenswrapper[4749]: I0310 15:55:54.845438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" event={"ID":"4ee977e3-6713-42e7-a5dc-60541afd18d1","Type":"ContainerStarted","Data":"8852b22df4f83d90cc1b69a8b21850c00b39d76d36310e1bbb1be9bfdfbd504a"} Mar 10 15:55:54 crc kubenswrapper[4749]: I0310 15:55:54.845914 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" event={"ID":"4ee977e3-6713-42e7-a5dc-60541afd18d1","Type":"ContainerStarted","Data":"6a12c78f2c56eab586bd1ec2ded2e133a8ce15a6db9c596b403747d89e174832"} Mar 10 15:55:54 crc kubenswrapper[4749]: I0310 
15:55:54.846067 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:54 crc kubenswrapper[4749]: I0310 15:55:54.852246 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" Mar 10 15:55:54 crc kubenswrapper[4749]: I0310 15:55:54.868250 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67c8b7b957-qtjq8" podStartSLOduration=3.8682236100000003 podStartE2EDuration="3.86822361s" podCreationTimestamp="2026-03-10 15:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:55:54.865323619 +0000 UTC m=+451.987189326" watchObservedRunningTime="2026-03-10 15:55:54.86822361 +0000 UTC m=+451.990089297" Mar 10 15:55:57 crc kubenswrapper[4749]: I0310 15:55:57.009559 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-xww7q" Mar 10 15:55:57 crc kubenswrapper[4749]: I0310 15:55:57.063345 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m2m4f"] Mar 10 15:56:00 crc kubenswrapper[4749]: I0310 15:56:00.140181 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552636-dldtb"] Mar 10 15:56:00 crc kubenswrapper[4749]: I0310 15:56:00.141588 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552636-dldtb" Mar 10 15:56:00 crc kubenswrapper[4749]: I0310 15:56:00.147046 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 15:56:00 crc kubenswrapper[4749]: I0310 15:56:00.147046 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:56:00 crc kubenswrapper[4749]: I0310 15:56:00.148140 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:56:00 crc kubenswrapper[4749]: I0310 15:56:00.153211 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552636-dldtb"] Mar 10 15:56:00 crc kubenswrapper[4749]: I0310 15:56:00.250624 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhdh2\" (UniqueName: \"kubernetes.io/projected/3d648991-7780-40b1-844f-d735838969c7-kube-api-access-fhdh2\") pod \"auto-csr-approver-29552636-dldtb\" (UID: \"3d648991-7780-40b1-844f-d735838969c7\") " pod="openshift-infra/auto-csr-approver-29552636-dldtb" Mar 10 15:56:00 crc kubenswrapper[4749]: I0310 15:56:00.352427 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhdh2\" (UniqueName: \"kubernetes.io/projected/3d648991-7780-40b1-844f-d735838969c7-kube-api-access-fhdh2\") pod \"auto-csr-approver-29552636-dldtb\" (UID: \"3d648991-7780-40b1-844f-d735838969c7\") " pod="openshift-infra/auto-csr-approver-29552636-dldtb" Mar 10 15:56:00 crc kubenswrapper[4749]: I0310 15:56:00.373585 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhdh2\" (UniqueName: \"kubernetes.io/projected/3d648991-7780-40b1-844f-d735838969c7-kube-api-access-fhdh2\") pod \"auto-csr-approver-29552636-dldtb\" (UID: \"3d648991-7780-40b1-844f-d735838969c7\") " 
pod="openshift-infra/auto-csr-approver-29552636-dldtb" Mar 10 15:56:00 crc kubenswrapper[4749]: I0310 15:56:00.463770 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552636-dldtb" Mar 10 15:56:00 crc kubenswrapper[4749]: I0310 15:56:00.889831 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552636-dldtb"] Mar 10 15:56:01 crc kubenswrapper[4749]: I0310 15:56:01.897224 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552636-dldtb" event={"ID":"3d648991-7780-40b1-844f-d735838969c7","Type":"ContainerStarted","Data":"7874d98a7a5dcd505dbfb2e008655f37e2a1a7bf28c93cc9feb3a134fc87431e"} Mar 10 15:56:03 crc kubenswrapper[4749]: I0310 15:56:03.910132 4749 generic.go:334] "Generic (PLEG): container finished" podID="3d648991-7780-40b1-844f-d735838969c7" containerID="c60152907d563889bb402382f1124b478ec87751301278e202a6fc1400fc617f" exitCode=0 Mar 10 15:56:03 crc kubenswrapper[4749]: I0310 15:56:03.910329 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552636-dldtb" event={"ID":"3d648991-7780-40b1-844f-d735838969c7","Type":"ContainerDied","Data":"c60152907d563889bb402382f1124b478ec87751301278e202a6fc1400fc617f"} Mar 10 15:56:05 crc kubenswrapper[4749]: I0310 15:56:05.279115 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552636-dldtb" Mar 10 15:56:05 crc kubenswrapper[4749]: I0310 15:56:05.350613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhdh2\" (UniqueName: \"kubernetes.io/projected/3d648991-7780-40b1-844f-d735838969c7-kube-api-access-fhdh2\") pod \"3d648991-7780-40b1-844f-d735838969c7\" (UID: \"3d648991-7780-40b1-844f-d735838969c7\") " Mar 10 15:56:05 crc kubenswrapper[4749]: I0310 15:56:05.357578 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d648991-7780-40b1-844f-d735838969c7-kube-api-access-fhdh2" (OuterVolumeSpecName: "kube-api-access-fhdh2") pod "3d648991-7780-40b1-844f-d735838969c7" (UID: "3d648991-7780-40b1-844f-d735838969c7"). InnerVolumeSpecName "kube-api-access-fhdh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:56:05 crc kubenswrapper[4749]: I0310 15:56:05.452539 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhdh2\" (UniqueName: \"kubernetes.io/projected/3d648991-7780-40b1-844f-d735838969c7-kube-api-access-fhdh2\") on node \"crc\" DevicePath \"\"" Mar 10 15:56:05 crc kubenswrapper[4749]: I0310 15:56:05.929574 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552636-dldtb" event={"ID":"3d648991-7780-40b1-844f-d735838969c7","Type":"ContainerDied","Data":"7874d98a7a5dcd505dbfb2e008655f37e2a1a7bf28c93cc9feb3a134fc87431e"} Mar 10 15:56:05 crc kubenswrapper[4749]: I0310 15:56:05.929641 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7874d98a7a5dcd505dbfb2e008655f37e2a1a7bf28c93cc9feb3a134fc87431e" Mar 10 15:56:05 crc kubenswrapper[4749]: I0310 15:56:05.929672 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552636-dldtb" Mar 10 15:56:06 crc kubenswrapper[4749]: I0310 15:56:06.356390 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552630-vvkbm"] Mar 10 15:56:06 crc kubenswrapper[4749]: I0310 15:56:06.360436 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552630-vvkbm"] Mar 10 15:56:07 crc kubenswrapper[4749]: I0310 15:56:07.617735 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="046a02a2-14f4-4368-9f21-58d96a510927" path="/var/lib/kubelet/pods/046a02a2-14f4-4368-9f21-58d96a510927/volumes" Mar 10 15:56:20 crc kubenswrapper[4749]: I0310 15:56:20.981167 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:56:20 crc kubenswrapper[4749]: I0310 15:56:20.981960 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.109479 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" podUID="085fc200-fd9e-4e5b-9aef-5a5488c5cb17" containerName="registry" containerID="cri-o://a75d818ff88dcd43558378065a09dc9c2b1bf350cbacea662004daf262d0617c" gracePeriod=30 Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.135083 4749 patch_prober.go:28] interesting pod/image-registry-697d97f7c8-m2m4f container/registry namespace/openshift-image-registry: Readiness 
probe status=failure output="Get \"https://10.217.0.23:5000/healthz\": dial tcp 10.217.0.23:5000: connect: connection refused" start-of-body=
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.135171 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" podUID="085fc200-fd9e-4e5b-9aef-5a5488c5cb17" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.23:5000/healthz\": dial tcp 10.217.0.23:5000: connect: connection refused"
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.557772 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f"
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.708128 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-installation-pull-secrets\") pod \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") "
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.708220 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-registry-certificates\") pod \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") "
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.708261 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-ca-trust-extracted\") pod \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") "
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.708332 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-bound-sa-token\") pod \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") "
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.708402 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-trusted-ca\") pod \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") "
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.708447 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mljj6\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-kube-api-access-mljj6\") pod \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") "
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.708506 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-registry-tls\") pod \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") "
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.708677 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\" (UID: \"085fc200-fd9e-4e5b-9aef-5a5488c5cb17\") "
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.709534 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "085fc200-fd9e-4e5b-9aef-5a5488c5cb17" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.710147 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "085fc200-fd9e-4e5b-9aef-5a5488c5cb17" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.715643 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "085fc200-fd9e-4e5b-9aef-5a5488c5cb17" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.716001 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "085fc200-fd9e-4e5b-9aef-5a5488c5cb17" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.716321 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "085fc200-fd9e-4e5b-9aef-5a5488c5cb17" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.716917 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-kube-api-access-mljj6" (OuterVolumeSpecName: "kube-api-access-mljj6") pod "085fc200-fd9e-4e5b-9aef-5a5488c5cb17" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17"). InnerVolumeSpecName "kube-api-access-mljj6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.719511 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "085fc200-fd9e-4e5b-9aef-5a5488c5cb17" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.726292 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "085fc200-fd9e-4e5b-9aef-5a5488c5cb17" (UID: "085fc200-fd9e-4e5b-9aef-5a5488c5cb17"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.810478 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.810547 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mljj6\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-kube-api-access-mljj6\") on node \"crc\" DevicePath \"\""
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.810570 4749 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.810582 4749 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.810593 4749 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.810605 4749 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 10 15:56:22 crc kubenswrapper[4749]: I0310 15:56:22.810617 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/085fc200-fd9e-4e5b-9aef-5a5488c5cb17-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 10 15:56:23 crc kubenswrapper[4749]: I0310 15:56:23.050454 4749 generic.go:334] "Generic (PLEG): container finished" podID="085fc200-fd9e-4e5b-9aef-5a5488c5cb17" containerID="a75d818ff88dcd43558378065a09dc9c2b1bf350cbacea662004daf262d0617c" exitCode=0
Mar 10 15:56:23 crc kubenswrapper[4749]: I0310 15:56:23.050523 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" event={"ID":"085fc200-fd9e-4e5b-9aef-5a5488c5cb17","Type":"ContainerDied","Data":"a75d818ff88dcd43558378065a09dc9c2b1bf350cbacea662004daf262d0617c"}
Mar 10 15:56:23 crc kubenswrapper[4749]: I0310 15:56:23.050561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f" event={"ID":"085fc200-fd9e-4e5b-9aef-5a5488c5cb17","Type":"ContainerDied","Data":"c996ecc19690f519946ad87fbc4b740aa743f344151b2405f6abe804584b7d03"}
Mar 10 15:56:23 crc kubenswrapper[4749]: I0310 15:56:23.050531 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m2m4f"
Mar 10 15:56:23 crc kubenswrapper[4749]: I0310 15:56:23.050589 4749 scope.go:117] "RemoveContainer" containerID="a75d818ff88dcd43558378065a09dc9c2b1bf350cbacea662004daf262d0617c"
Mar 10 15:56:23 crc kubenswrapper[4749]: I0310 15:56:23.069868 4749 scope.go:117] "RemoveContainer" containerID="a75d818ff88dcd43558378065a09dc9c2b1bf350cbacea662004daf262d0617c"
Mar 10 15:56:23 crc kubenswrapper[4749]: E0310 15:56:23.070425 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a75d818ff88dcd43558378065a09dc9c2b1bf350cbacea662004daf262d0617c\": container with ID starting with a75d818ff88dcd43558378065a09dc9c2b1bf350cbacea662004daf262d0617c not found: ID does not exist" containerID="a75d818ff88dcd43558378065a09dc9c2b1bf350cbacea662004daf262d0617c"
Mar 10 15:56:23 crc kubenswrapper[4749]: I0310 15:56:23.070465 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75d818ff88dcd43558378065a09dc9c2b1bf350cbacea662004daf262d0617c"} err="failed to get container status \"a75d818ff88dcd43558378065a09dc9c2b1bf350cbacea662004daf262d0617c\": rpc error: code = NotFound desc = could not find container \"a75d818ff88dcd43558378065a09dc9c2b1bf350cbacea662004daf262d0617c\": container with ID starting with a75d818ff88dcd43558378065a09dc9c2b1bf350cbacea662004daf262d0617c not found: ID does not exist"
Mar 10 15:56:23 crc kubenswrapper[4749]: I0310 15:56:23.080574 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m2m4f"]
Mar 10 15:56:23 crc kubenswrapper[4749]: I0310 15:56:23.084613 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m2m4f"]
Mar 10 15:56:23 crc kubenswrapper[4749]: I0310 15:56:23.614540 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="085fc200-fd9e-4e5b-9aef-5a5488c5cb17" path="/var/lib/kubelet/pods/085fc200-fd9e-4e5b-9aef-5a5488c5cb17/volumes"
Mar 10 15:56:50 crc kubenswrapper[4749]: I0310 15:56:50.980922 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:56:50 crc kubenswrapper[4749]: I0310 15:56:50.981719 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:56:50 crc kubenswrapper[4749]: I0310 15:56:50.981792 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts"
Mar 10 15:56:50 crc kubenswrapper[4749]: I0310 15:56:50.982738 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"235eec5455f54320042cab9b4bf8ef066c9980bb92c37290b73e4a493f648064"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 15:56:50 crc kubenswrapper[4749]: I0310 15:56:50.982827 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://235eec5455f54320042cab9b4bf8ef066c9980bb92c37290b73e4a493f648064" gracePeriod=600
Mar 10 15:56:51 crc kubenswrapper[4749]: I0310 15:56:51.232120 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="235eec5455f54320042cab9b4bf8ef066c9980bb92c37290b73e4a493f648064" exitCode=0
Mar 10 15:56:51 crc kubenswrapper[4749]: I0310 15:56:51.232199 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"235eec5455f54320042cab9b4bf8ef066c9980bb92c37290b73e4a493f648064"}
Mar 10 15:56:51 crc kubenswrapper[4749]: I0310 15:56:51.232278 4749 scope.go:117] "RemoveContainer" containerID="38e5364b4210e2bc848904cf499827c1e2a5cbf1d4f02019fac92f84f23583e7"
Mar 10 15:56:52 crc kubenswrapper[4749]: I0310 15:56:52.241341 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"7b78ae72f8895fc1df287649b2d990626337b8a539e3e03d294824b60e7e24d6"}
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.142428 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552638-2j4cg"]
Mar 10 15:58:00 crc kubenswrapper[4749]: E0310 15:58:00.143409 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d648991-7780-40b1-844f-d735838969c7" containerName="oc"
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.143434 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d648991-7780-40b1-844f-d735838969c7" containerName="oc"
Mar 10 15:58:00 crc kubenswrapper[4749]: E0310 15:58:00.143464 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085fc200-fd9e-4e5b-9aef-5a5488c5cb17" containerName="registry"
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.143471 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="085fc200-fd9e-4e5b-9aef-5a5488c5cb17" containerName="registry"
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.143605 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="085fc200-fd9e-4e5b-9aef-5a5488c5cb17" containerName="registry"
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.143621 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d648991-7780-40b1-844f-d735838969c7" containerName="oc"
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.144164 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552638-2j4cg"
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.148082 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7"
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.148532 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.148835 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.153173 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552638-2j4cg"]
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.230987 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvkzd\" (UniqueName: \"kubernetes.io/projected/de41b32f-3fbe-43cc-b8fd-7dc121e0d686-kube-api-access-fvkzd\") pod \"auto-csr-approver-29552638-2j4cg\" (UID: \"de41b32f-3fbe-43cc-b8fd-7dc121e0d686\") " pod="openshift-infra/auto-csr-approver-29552638-2j4cg"
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.333207 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvkzd\" (UniqueName: \"kubernetes.io/projected/de41b32f-3fbe-43cc-b8fd-7dc121e0d686-kube-api-access-fvkzd\") pod \"auto-csr-approver-29552638-2j4cg\" (UID: \"de41b32f-3fbe-43cc-b8fd-7dc121e0d686\") " pod="openshift-infra/auto-csr-approver-29552638-2j4cg"
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.355018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvkzd\" (UniqueName: \"kubernetes.io/projected/de41b32f-3fbe-43cc-b8fd-7dc121e0d686-kube-api-access-fvkzd\") pod \"auto-csr-approver-29552638-2j4cg\" (UID: \"de41b32f-3fbe-43cc-b8fd-7dc121e0d686\") " pod="openshift-infra/auto-csr-approver-29552638-2j4cg"
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.467011 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552638-2j4cg"
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.686422 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552638-2j4cg"]
Mar 10 15:58:00 crc kubenswrapper[4749]: I0310 15:58:00.699983 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 15:58:01 crc kubenswrapper[4749]: I0310 15:58:01.678638 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552638-2j4cg" event={"ID":"de41b32f-3fbe-43cc-b8fd-7dc121e0d686","Type":"ContainerStarted","Data":"6ea974be82bda9d897eb52228c89712a693481ae01212eaa9c8d0d1bc41619c5"}
Mar 10 15:58:02 crc kubenswrapper[4749]: I0310 15:58:02.687238 4749 generic.go:334] "Generic (PLEG): container finished" podID="de41b32f-3fbe-43cc-b8fd-7dc121e0d686" containerID="5db944d8e89880ee300a5fe1c79a6c36b1c6f7337b4fa429574aa3e51954c23b" exitCode=0
Mar 10 15:58:02 crc kubenswrapper[4749]: I0310 15:58:02.687368 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552638-2j4cg" event={"ID":"de41b32f-3fbe-43cc-b8fd-7dc121e0d686","Type":"ContainerDied","Data":"5db944d8e89880ee300a5fe1c79a6c36b1c6f7337b4fa429574aa3e51954c23b"}
Mar 10 15:58:03 crc kubenswrapper[4749]: I0310 15:58:03.929509 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552638-2j4cg"
Mar 10 15:58:03 crc kubenswrapper[4749]: I0310 15:58:03.983803 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvkzd\" (UniqueName: \"kubernetes.io/projected/de41b32f-3fbe-43cc-b8fd-7dc121e0d686-kube-api-access-fvkzd\") pod \"de41b32f-3fbe-43cc-b8fd-7dc121e0d686\" (UID: \"de41b32f-3fbe-43cc-b8fd-7dc121e0d686\") "
Mar 10 15:58:03 crc kubenswrapper[4749]: I0310 15:58:03.994617 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de41b32f-3fbe-43cc-b8fd-7dc121e0d686-kube-api-access-fvkzd" (OuterVolumeSpecName: "kube-api-access-fvkzd") pod "de41b32f-3fbe-43cc-b8fd-7dc121e0d686" (UID: "de41b32f-3fbe-43cc-b8fd-7dc121e0d686"). InnerVolumeSpecName "kube-api-access-fvkzd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:58:04 crc kubenswrapper[4749]: I0310 15:58:04.085950 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvkzd\" (UniqueName: \"kubernetes.io/projected/de41b32f-3fbe-43cc-b8fd-7dc121e0d686-kube-api-access-fvkzd\") on node \"crc\" DevicePath \"\""
Mar 10 15:58:04 crc kubenswrapper[4749]: I0310 15:58:04.705055 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552638-2j4cg" event={"ID":"de41b32f-3fbe-43cc-b8fd-7dc121e0d686","Type":"ContainerDied","Data":"6ea974be82bda9d897eb52228c89712a693481ae01212eaa9c8d0d1bc41619c5"}
Mar 10 15:58:04 crc kubenswrapper[4749]: I0310 15:58:04.705106 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ea974be82bda9d897eb52228c89712a693481ae01212eaa9c8d0d1bc41619c5"
Mar 10 15:58:04 crc kubenswrapper[4749]: I0310 15:58:04.705539 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552638-2j4cg"
Mar 10 15:58:04 crc kubenswrapper[4749]: I0310 15:58:04.995981 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552632-xx7p8"]
Mar 10 15:58:04 crc kubenswrapper[4749]: I0310 15:58:04.999235 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552632-xx7p8"]
Mar 10 15:58:05 crc kubenswrapper[4749]: I0310 15:58:05.615660 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec11cc4-b4eb-4b0b-b803-832ac4051974" path="/var/lib/kubelet/pods/dec11cc4-b4eb-4b0b-b803-832ac4051974/volumes"
Mar 10 15:58:52 crc kubenswrapper[4749]: I0310 15:58:52.762325 4749 scope.go:117] "RemoveContainer" containerID="e79e714b546820829bf59539aedd0158696a451f1a0f0d779efe7b9d5dbe05c5"
Mar 10 15:58:52 crc kubenswrapper[4749]: I0310 15:58:52.788680 4749 scope.go:117] "RemoveContainer" containerID="72346773b4f848face18c55b34cc402ef70e709b5b0d6c1ac1ea0618d289831c"
Mar 10 15:59:20 crc kubenswrapper[4749]: I0310 15:59:20.981167 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:59:20 crc kubenswrapper[4749]: I0310 15:59:20.982036 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:59:50 crc kubenswrapper[4749]: I0310 15:59:50.980671 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:59:50 crc kubenswrapper[4749]: I0310 15:59:50.981470 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:59:52 crc kubenswrapper[4749]: I0310 15:59:52.837030 4749 scope.go:117] "RemoveContainer" containerID="3a01343d650fc3d5ef536514b1ee7e77aa51c7b3d3328e142d6c7e27b8ce4d74"
Mar 10 15:59:52 crc kubenswrapper[4749]: I0310 15:59:52.873252 4749 scope.go:117] "RemoveContainer" containerID="7f9621f29ed38d6ea4048c4fa33039fff9f4e6fdc3241491f571eeae8a8a90c7"
Mar 10 15:59:52 crc kubenswrapper[4749]: I0310 15:59:52.914999 4749 scope.go:117] "RemoveContainer" containerID="cdf2b1e8906734bb83dd1444e0fcf3c085b3c1b29c40074eebfdc4e2bbf25dea"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.144239 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552640-jlz6h"]
Mar 10 16:00:00 crc kubenswrapper[4749]: E0310 16:00:00.145396 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de41b32f-3fbe-43cc-b8fd-7dc121e0d686" containerName="oc"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.145422 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="de41b32f-3fbe-43cc-b8fd-7dc121e0d686" containerName="oc"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.145533 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="de41b32f-3fbe-43cc-b8fd-7dc121e0d686" containerName="oc"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.146054 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552640-jlz6h"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.148277 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.148495 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.149683 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"]
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.150114 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.150809 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.152766 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.152802 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.159125 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552640-jlz6h"]
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.164663 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"]
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.275496 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90f3b22b-18d6-498a-a003-ffa4c40f362e-config-volume\") pod \"collect-profiles-29552640-4sg2n\" (UID: \"90f3b22b-18d6-498a-a003-ffa4c40f362e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.275656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/90f3b22b-18d6-498a-a003-ffa4c40f362e-secret-volume\") pod \"collect-profiles-29552640-4sg2n\" (UID: \"90f3b22b-18d6-498a-a003-ffa4c40f362e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.275701 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9br6\" (UniqueName: \"kubernetes.io/projected/90f3b22b-18d6-498a-a003-ffa4c40f362e-kube-api-access-b9br6\") pod \"collect-profiles-29552640-4sg2n\" (UID: \"90f3b22b-18d6-498a-a003-ffa4c40f362e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.275780 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79n7b\" (UniqueName: \"kubernetes.io/projected/d29452a9-eba1-4780-b94c-72b02ca17315-kube-api-access-79n7b\") pod \"auto-csr-approver-29552640-jlz6h\" (UID: \"d29452a9-eba1-4780-b94c-72b02ca17315\") " pod="openshift-infra/auto-csr-approver-29552640-jlz6h"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.377703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90f3b22b-18d6-498a-a003-ffa4c40f362e-config-volume\") pod \"collect-profiles-29552640-4sg2n\" (UID: \"90f3b22b-18d6-498a-a003-ffa4c40f362e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.377800 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/90f3b22b-18d6-498a-a003-ffa4c40f362e-secret-volume\") pod \"collect-profiles-29552640-4sg2n\" (UID: \"90f3b22b-18d6-498a-a003-ffa4c40f362e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.377821 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9br6\" (UniqueName: \"kubernetes.io/projected/90f3b22b-18d6-498a-a003-ffa4c40f362e-kube-api-access-b9br6\") pod \"collect-profiles-29552640-4sg2n\" (UID: \"90f3b22b-18d6-498a-a003-ffa4c40f362e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.377855 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79n7b\" (UniqueName: \"kubernetes.io/projected/d29452a9-eba1-4780-b94c-72b02ca17315-kube-api-access-79n7b\") pod \"auto-csr-approver-29552640-jlz6h\" (UID: \"d29452a9-eba1-4780-b94c-72b02ca17315\") " pod="openshift-infra/auto-csr-approver-29552640-jlz6h"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.379102 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90f3b22b-18d6-498a-a003-ffa4c40f362e-config-volume\") pod \"collect-profiles-29552640-4sg2n\" (UID: \"90f3b22b-18d6-498a-a003-ffa4c40f362e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.386875 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/90f3b22b-18d6-498a-a003-ffa4c40f362e-secret-volume\") pod \"collect-profiles-29552640-4sg2n\" (UID: \"90f3b22b-18d6-498a-a003-ffa4c40f362e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.398043 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9br6\" (UniqueName: \"kubernetes.io/projected/90f3b22b-18d6-498a-a003-ffa4c40f362e-kube-api-access-b9br6\") pod \"collect-profiles-29552640-4sg2n\" (UID: \"90f3b22b-18d6-498a-a003-ffa4c40f362e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.398051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79n7b\" (UniqueName: \"kubernetes.io/projected/d29452a9-eba1-4780-b94c-72b02ca17315-kube-api-access-79n7b\") pod \"auto-csr-approver-29552640-jlz6h\" (UID: \"d29452a9-eba1-4780-b94c-72b02ca17315\") " pod="openshift-infra/auto-csr-approver-29552640-jlz6h"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.470826 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552640-jlz6h"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.480444 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.686705 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"]
Mar 10 16:00:00 crc kubenswrapper[4749]: I0310 16:00:00.714612 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552640-jlz6h"]
Mar 10 16:00:00 crc kubenswrapper[4749]: W0310 16:00:00.730342 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd29452a9_eba1_4780_b94c_72b02ca17315.slice/crio-5a6691c7fc28121456211e2878993d37f0817ace3876e8a51a040557cb51010f WatchSource:0}: Error finding container 5a6691c7fc28121456211e2878993d37f0817ace3876e8a51a040557cb51010f: Status 404 returned error can't find the container with id 5a6691c7fc28121456211e2878993d37f0817ace3876e8a51a040557cb51010f
Mar 10 16:00:01 crc kubenswrapper[4749]: E0310 16:00:01.126583 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90f3b22b_18d6_498a_a003_ffa4c40f362e.slice/crio-conmon-096d014e267ba0f0ea011913fed5890b35fa832d09571c15acf58fd842b7bb79.scope\": RecentStats: unable to find data in memory cache]"
Mar 10 16:00:01 crc kubenswrapper[4749]: I0310 16:00:01.480925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552640-jlz6h" event={"ID":"d29452a9-eba1-4780-b94c-72b02ca17315","Type":"ContainerStarted","Data":"5a6691c7fc28121456211e2878993d37f0817ace3876e8a51a040557cb51010f"}
Mar 10 16:00:01 crc kubenswrapper[4749]: I0310 16:00:01.482763 4749 generic.go:334] "Generic (PLEG): container finished" podID="90f3b22b-18d6-498a-a003-ffa4c40f362e" containerID="096d014e267ba0f0ea011913fed5890b35fa832d09571c15acf58fd842b7bb79" exitCode=0
Mar 10 16:00:01 crc kubenswrapper[4749]: I0310 16:00:01.482796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n" event={"ID":"90f3b22b-18d6-498a-a003-ffa4c40f362e","Type":"ContainerDied","Data":"096d014e267ba0f0ea011913fed5890b35fa832d09571c15acf58fd842b7bb79"}
Mar 10 16:00:01 crc kubenswrapper[4749]: I0310 16:00:01.482814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n" event={"ID":"90f3b22b-18d6-498a-a003-ffa4c40f362e","Type":"ContainerStarted","Data":"eb7d55a8ff20eb19b61d62f177372d728e0d671ece860760e68575db830d4cc7"}
Mar 10 16:00:02 crc kubenswrapper[4749]: I0310 16:00:02.734939 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"
Mar 10 16:00:02 crc kubenswrapper[4749]: I0310 16:00:02.820649 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90f3b22b-18d6-498a-a003-ffa4c40f362e-config-volume\") pod \"90f3b22b-18d6-498a-a003-ffa4c40f362e\" (UID: \"90f3b22b-18d6-498a-a003-ffa4c40f362e\") "
Mar 10 16:00:02 crc kubenswrapper[4749]: I0310 16:00:02.820765 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/90f3b22b-18d6-498a-a003-ffa4c40f362e-secret-volume\") pod \"90f3b22b-18d6-498a-a003-ffa4c40f362e\" (UID: \"90f3b22b-18d6-498a-a003-ffa4c40f362e\") "
Mar 10 16:00:02 crc kubenswrapper[4749]: I0310 16:00:02.820875 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9br6\" (UniqueName: \"kubernetes.io/projected/90f3b22b-18d6-498a-a003-ffa4c40f362e-kube-api-access-b9br6\") pod \"90f3b22b-18d6-498a-a003-ffa4c40f362e\" (UID: \"90f3b22b-18d6-498a-a003-ffa4c40f362e\") "
Mar 10 16:00:02 crc kubenswrapper[4749]: I0310 16:00:02.821982 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f3b22b-18d6-498a-a003-ffa4c40f362e-config-volume" (OuterVolumeSpecName: "config-volume") pod "90f3b22b-18d6-498a-a003-ffa4c40f362e" (UID: "90f3b22b-18d6-498a-a003-ffa4c40f362e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 16:00:02 crc kubenswrapper[4749]: I0310 16:00:02.827711 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90f3b22b-18d6-498a-a003-ffa4c40f362e-kube-api-access-b9br6" (OuterVolumeSpecName: "kube-api-access-b9br6") pod "90f3b22b-18d6-498a-a003-ffa4c40f362e" (UID: "90f3b22b-18d6-498a-a003-ffa4c40f362e"). InnerVolumeSpecName "kube-api-access-b9br6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:00:02 crc kubenswrapper[4749]: I0310 16:00:02.828671 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f3b22b-18d6-498a-a003-ffa4c40f362e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "90f3b22b-18d6-498a-a003-ffa4c40f362e" (UID: "90f3b22b-18d6-498a-a003-ffa4c40f362e"). InnerVolumeSpecName "secret-volume".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:00:02 crc kubenswrapper[4749]: I0310 16:00:02.923088 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9br6\" (UniqueName: \"kubernetes.io/projected/90f3b22b-18d6-498a-a003-ffa4c40f362e-kube-api-access-b9br6\") on node \"crc\" DevicePath \"\"" Mar 10 16:00:02 crc kubenswrapper[4749]: I0310 16:00:02.923134 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90f3b22b-18d6-498a-a003-ffa4c40f362e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 16:00:02 crc kubenswrapper[4749]: I0310 16:00:02.923145 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/90f3b22b-18d6-498a-a003-ffa4c40f362e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 16:00:03 crc kubenswrapper[4749]: I0310 16:00:03.497114 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n" event={"ID":"90f3b22b-18d6-498a-a003-ffa4c40f362e","Type":"ContainerDied","Data":"eb7d55a8ff20eb19b61d62f177372d728e0d671ece860760e68575db830d4cc7"} Mar 10 16:00:03 crc kubenswrapper[4749]: I0310 16:00:03.497915 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb7d55a8ff20eb19b61d62f177372d728e0d671ece860760e68575db830d4cc7" Mar 10 16:00:03 crc kubenswrapper[4749]: I0310 16:00:03.497225 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n" Mar 10 16:00:14 crc kubenswrapper[4749]: I0310 16:00:14.587978 4749 generic.go:334] "Generic (PLEG): container finished" podID="d29452a9-eba1-4780-b94c-72b02ca17315" containerID="e3238c98207593cd21fd73b85c538bfa562bd50f0e54e0a2d2cc80dc34102b68" exitCode=0 Mar 10 16:00:14 crc kubenswrapper[4749]: I0310 16:00:14.588133 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552640-jlz6h" event={"ID":"d29452a9-eba1-4780-b94c-72b02ca17315","Type":"ContainerDied","Data":"e3238c98207593cd21fd73b85c538bfa562bd50f0e54e0a2d2cc80dc34102b68"} Mar 10 16:00:15 crc kubenswrapper[4749]: I0310 16:00:15.893600 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552640-jlz6h" Mar 10 16:00:16 crc kubenswrapper[4749]: I0310 16:00:16.030252 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79n7b\" (UniqueName: \"kubernetes.io/projected/d29452a9-eba1-4780-b94c-72b02ca17315-kube-api-access-79n7b\") pod \"d29452a9-eba1-4780-b94c-72b02ca17315\" (UID: \"d29452a9-eba1-4780-b94c-72b02ca17315\") " Mar 10 16:00:16 crc kubenswrapper[4749]: I0310 16:00:16.037576 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d29452a9-eba1-4780-b94c-72b02ca17315-kube-api-access-79n7b" (OuterVolumeSpecName: "kube-api-access-79n7b") pod "d29452a9-eba1-4780-b94c-72b02ca17315" (UID: "d29452a9-eba1-4780-b94c-72b02ca17315"). InnerVolumeSpecName "kube-api-access-79n7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:00:16 crc kubenswrapper[4749]: I0310 16:00:16.132323 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79n7b\" (UniqueName: \"kubernetes.io/projected/d29452a9-eba1-4780-b94c-72b02ca17315-kube-api-access-79n7b\") on node \"crc\" DevicePath \"\"" Mar 10 16:00:16 crc kubenswrapper[4749]: I0310 16:00:16.604063 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552640-jlz6h" event={"ID":"d29452a9-eba1-4780-b94c-72b02ca17315","Type":"ContainerDied","Data":"5a6691c7fc28121456211e2878993d37f0817ace3876e8a51a040557cb51010f"} Mar 10 16:00:16 crc kubenswrapper[4749]: I0310 16:00:16.604484 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a6691c7fc28121456211e2878993d37f0817ace3876e8a51a040557cb51010f" Mar 10 16:00:16 crc kubenswrapper[4749]: I0310 16:00:16.604111 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552640-jlz6h" Mar 10 16:00:16 crc kubenswrapper[4749]: I0310 16:00:16.967311 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552634-dv6cc"] Mar 10 16:00:16 crc kubenswrapper[4749]: I0310 16:00:16.971223 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552634-dv6cc"] Mar 10 16:00:17 crc kubenswrapper[4749]: I0310 16:00:17.620658 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59539c85-68ad-4b62-8484-ccda9def3258" path="/var/lib/kubelet/pods/59539c85-68ad-4b62-8484-ccda9def3258/volumes" Mar 10 16:00:20 crc kubenswrapper[4749]: I0310 16:00:20.980882 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 16:00:20 crc kubenswrapper[4749]: I0310 16:00:20.981304 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:00:20 crc kubenswrapper[4749]: I0310 16:00:20.981411 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 16:00:20 crc kubenswrapper[4749]: I0310 16:00:20.982416 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b78ae72f8895fc1df287649b2d990626337b8a539e3e03d294824b60e7e24d6"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:00:20 crc kubenswrapper[4749]: I0310 16:00:20.982521 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://7b78ae72f8895fc1df287649b2d990626337b8a539e3e03d294824b60e7e24d6" gracePeriod=600 Mar 10 16:00:21 crc kubenswrapper[4749]: I0310 16:00:21.634472 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="7b78ae72f8895fc1df287649b2d990626337b8a539e3e03d294824b60e7e24d6" exitCode=0 Mar 10 16:00:21 crc kubenswrapper[4749]: I0310 16:00:21.634563 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"7b78ae72f8895fc1df287649b2d990626337b8a539e3e03d294824b60e7e24d6"} Mar 10 16:00:21 crc kubenswrapper[4749]: I0310 16:00:21.634888 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"d4ea85a6b744107fd1b757efd6ea6aed1ac10e45ac86a77df1413fc6180c0184"} Mar 10 16:00:21 crc kubenswrapper[4749]: I0310 16:00:21.634915 4749 scope.go:117] "RemoveContainer" containerID="235eec5455f54320042cab9b4bf8ef066c9980bb92c37290b73e4a493f648064" Mar 10 16:00:52 crc kubenswrapper[4749]: I0310 16:00:52.995659 4749 scope.go:117] "RemoveContainer" containerID="8595c937dc76ff9e9c7d7c657e2f4293b7ea4e946274d580b3f51966df7734a8" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.074196 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ls45l"] Mar 10 16:01:18 crc kubenswrapper[4749]: E0310 16:01:18.076032 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f3b22b-18d6-498a-a003-ffa4c40f362e" containerName="collect-profiles" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.076069 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f3b22b-18d6-498a-a003-ffa4c40f362e" containerName="collect-profiles" Mar 10 16:01:18 crc kubenswrapper[4749]: E0310 16:01:18.076085 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29452a9-eba1-4780-b94c-72b02ca17315" containerName="oc" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.076093 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29452a9-eba1-4780-b94c-72b02ca17315" containerName="oc" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.076210 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f3b22b-18d6-498a-a003-ffa4c40f362e" containerName="collect-profiles" Mar 10 16:01:18 crc 
kubenswrapper[4749]: I0310 16:01:18.076228 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d29452a9-eba1-4780-b94c-72b02ca17315" containerName="oc" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.076774 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ls45l" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.079871 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.080165 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.080457 4749 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-jnthk" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.083890 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.087073 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ls45l"] Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.185363 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-node-mnt\") pod \"crc-storage-crc-ls45l\" (UID: \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\") " pod="crc-storage/crc-storage-crc-ls45l" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.185611 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-crc-storage\") pod \"crc-storage-crc-ls45l\" (UID: \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\") " pod="crc-storage/crc-storage-crc-ls45l" Mar 10 16:01:18 crc kubenswrapper[4749]: 
I0310 16:01:18.185649 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4wkw\" (UniqueName: \"kubernetes.io/projected/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-kube-api-access-w4wkw\") pod \"crc-storage-crc-ls45l\" (UID: \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\") " pod="crc-storage/crc-storage-crc-ls45l" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.286559 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-node-mnt\") pod \"crc-storage-crc-ls45l\" (UID: \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\") " pod="crc-storage/crc-storage-crc-ls45l" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.286685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-crc-storage\") pod \"crc-storage-crc-ls45l\" (UID: \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\") " pod="crc-storage/crc-storage-crc-ls45l" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.286712 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4wkw\" (UniqueName: \"kubernetes.io/projected/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-kube-api-access-w4wkw\") pod \"crc-storage-crc-ls45l\" (UID: \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\") " pod="crc-storage/crc-storage-crc-ls45l" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.287057 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-node-mnt\") pod \"crc-storage-crc-ls45l\" (UID: \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\") " pod="crc-storage/crc-storage-crc-ls45l" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.288324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-crc-storage\") pod \"crc-storage-crc-ls45l\" (UID: \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\") " pod="crc-storage/crc-storage-crc-ls45l" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.312521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4wkw\" (UniqueName: \"kubernetes.io/projected/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-kube-api-access-w4wkw\") pod \"crc-storage-crc-ls45l\" (UID: \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\") " pod="crc-storage/crc-storage-crc-ls45l" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.396213 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ls45l" Mar 10 16:01:18 crc kubenswrapper[4749]: I0310 16:01:18.621211 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ls45l"] Mar 10 16:01:19 crc kubenswrapper[4749]: I0310 16:01:19.045855 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ls45l" event={"ID":"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c","Type":"ContainerStarted","Data":"d43ec6dab21d96c06745d48db00bea768d985a05c72de018ae503ffd55b9d805"} Mar 10 16:01:21 crc kubenswrapper[4749]: I0310 16:01:21.065163 4749 generic.go:334] "Generic (PLEG): container finished" podID="cf3818d3-7881-44ef-afd9-cb50f8a4bf4c" containerID="5081b4a2d9a3aa46c6b557f9c7d0279c98311e5391bc8c6b27c7d9ef0ea42e7c" exitCode=0 Mar 10 16:01:21 crc kubenswrapper[4749]: I0310 16:01:21.065242 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ls45l" event={"ID":"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c","Type":"ContainerDied","Data":"5081b4a2d9a3aa46c6b557f9c7d0279c98311e5391bc8c6b27c7d9ef0ea42e7c"} Mar 10 16:01:22 crc kubenswrapper[4749]: I0310 16:01:22.325731 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ls45l" Mar 10 16:01:22 crc kubenswrapper[4749]: I0310 16:01:22.453558 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4wkw\" (UniqueName: \"kubernetes.io/projected/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-kube-api-access-w4wkw\") pod \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\" (UID: \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\") " Mar 10 16:01:22 crc kubenswrapper[4749]: I0310 16:01:22.453630 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-crc-storage\") pod \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\" (UID: \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\") " Mar 10 16:01:22 crc kubenswrapper[4749]: I0310 16:01:22.453754 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-node-mnt\") pod \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\" (UID: \"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c\") " Mar 10 16:01:22 crc kubenswrapper[4749]: I0310 16:01:22.454048 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "cf3818d3-7881-44ef-afd9-cb50f8a4bf4c" (UID: "cf3818d3-7881-44ef-afd9-cb50f8a4bf4c"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:01:22 crc kubenswrapper[4749]: I0310 16:01:22.459662 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-kube-api-access-w4wkw" (OuterVolumeSpecName: "kube-api-access-w4wkw") pod "cf3818d3-7881-44ef-afd9-cb50f8a4bf4c" (UID: "cf3818d3-7881-44ef-afd9-cb50f8a4bf4c"). InnerVolumeSpecName "kube-api-access-w4wkw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:01:22 crc kubenswrapper[4749]: I0310 16:01:22.471201 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "cf3818d3-7881-44ef-afd9-cb50f8a4bf4c" (UID: "cf3818d3-7881-44ef-afd9-cb50f8a4bf4c"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:01:22 crc kubenswrapper[4749]: I0310 16:01:22.554958 4749 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:22 crc kubenswrapper[4749]: I0310 16:01:22.555005 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4wkw\" (UniqueName: \"kubernetes.io/projected/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-kube-api-access-w4wkw\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:22 crc kubenswrapper[4749]: I0310 16:01:22.555016 4749 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.079354 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ls45l" event={"ID":"cf3818d3-7881-44ef-afd9-cb50f8a4bf4c","Type":"ContainerDied","Data":"d43ec6dab21d96c06745d48db00bea768d985a05c72de018ae503ffd55b9d805"} Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.079437 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ls45l" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.079453 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d43ec6dab21d96c06745d48db00bea768d985a05c72de018ae503ffd55b9d805" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.459729 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nvpsq"] Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.460931 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovn-controller" containerID="cri-o://3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4" gracePeriod=30 Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.460986 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="nbdb" containerID="cri-o://540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83" gracePeriod=30 Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.461024 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="sbdb" containerID="cri-o://84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f" gracePeriod=30 Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.461099 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="kube-rbac-proxy-node" containerID="cri-o://b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06" gracePeriod=30 Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.461128 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovn-acl-logging" containerID="cri-o://58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97" gracePeriod=30 Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.461069 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f" gracePeriod=30 Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.461153 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="northd" containerID="cri-o://18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7" gracePeriod=30 Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.519050 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller" containerID="cri-o://53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c" gracePeriod=30 Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.762414 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/3.log" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.765888 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovn-acl-logging/0.log" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.766662 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovn-controller/0.log" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.767414 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829363 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-q2cvx"] Mar 10 16:01:23 crc kubenswrapper[4749]: E0310 16:01:23.829733 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf3818d3-7881-44ef-afd9-cb50f8a4bf4c" containerName="storage" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829755 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf3818d3-7881-44ef-afd9-cb50f8a4bf4c" containerName="storage" Mar 10 16:01:23 crc kubenswrapper[4749]: E0310 16:01:23.829767 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="nbdb" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829776 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="nbdb" Mar 10 16:01:23 crc kubenswrapper[4749]: E0310 16:01:23.829784 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="kubecfg-setup" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829815 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="kubecfg-setup" Mar 10 16:01:23 crc kubenswrapper[4749]: E0310 16:01:23.829825 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="sbdb" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829830 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" 
containerName="sbdb" Mar 10 16:01:23 crc kubenswrapper[4749]: E0310 16:01:23.829841 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="northd" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829846 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="northd" Mar 10 16:01:23 crc kubenswrapper[4749]: E0310 16:01:23.829855 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829862 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller" Mar 10 16:01:23 crc kubenswrapper[4749]: E0310 16:01:23.829872 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829878 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller" Mar 10 16:01:23 crc kubenswrapper[4749]: E0310 16:01:23.829886 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829892 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 16:01:23 crc kubenswrapper[4749]: E0310 16:01:23.829899 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovn-acl-logging" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829906 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" 
containerName="ovn-acl-logging"
Mar 10 16:01:23 crc kubenswrapper[4749]: E0310 16:01:23.829918 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="kube-rbac-proxy-node"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829925 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="kube-rbac-proxy-node"
Mar 10 16:01:23 crc kubenswrapper[4749]: E0310 16:01:23.829934 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829940 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller"
Mar 10 16:01:23 crc kubenswrapper[4749]: E0310 16:01:23.829948 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovn-controller"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829955 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovn-controller"
Mar 10 16:01:23 crc kubenswrapper[4749]: E0310 16:01:23.829963 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829969 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller"
Mar 10 16:01:23 crc kubenswrapper[4749]: E0310 16:01:23.829975 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.829980 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.830084 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovn-controller"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.830092 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="kube-rbac-proxy-ovn-metrics"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.830100 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.830107 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.830114 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.830125 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="kube-rbac-proxy-node"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.830134 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovn-acl-logging"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.830142 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="nbdb"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.830155 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf3818d3-7881-44ef-afd9-cb50f8a4bf4c" containerName="storage"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.830164 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="northd"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.830173 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="sbdb"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.830397 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.830410 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerName="ovnkube-controller"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.832368 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884239 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-systemd\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884345 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovnkube-config\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884441 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-etc-openvswitch\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884471 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovn-node-metrics-cert\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884501 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-run-netns\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884531 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-run-ovn-kubernetes\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884556 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-env-overrides\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884574 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-cni-netd\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884599 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-var-lib-openvswitch\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884621 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-openvswitch\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884647 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884669 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-ovn\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884691 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovnkube-script-lib\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884696 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884724 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-slash\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884797 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-slash" (OuterVolumeSpecName: "host-slash") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884817 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-cni-bin\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884794 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884860 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8gk7\" (UniqueName: \"kubernetes.io/projected/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-kube-api-access-s8gk7\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884861 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884908 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-log-socket" (OuterVolumeSpecName: "log-socket") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884936 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884948 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884889 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-log-socket\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.884973 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885044 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-kubelet\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885072 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885076 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-systemd-units\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885105 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885124 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-node-log\") pod \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\" (UID: \"fac9a20c-b1f6-4bb2-a363-072abb3c04d2\") "
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885137 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885165 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885261 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-node-log" (OuterVolumeSpecName: "node-log") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885409 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885359 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885409 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-var-lib-openvswitch\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885462 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885510 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-cni-bin\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885511 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885612 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd8sx\" (UniqueName: \"kubernetes.io/projected/831eb063-b498-45bf-b5af-45851e042a75-kube-api-access-gd8sx\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885698 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-log-socket\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885740 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-run-netns\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-run-ovn\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885910 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/831eb063-b498-45bf-b5af-45851e042a75-ovn-node-metrics-cert\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.885993 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-run-ovn-kubernetes\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/831eb063-b498-45bf-b5af-45851e042a75-ovnkube-script-lib\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-cni-netd\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886112 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886199 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-slash\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886240 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-run-systemd\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886263 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/831eb063-b498-45bf-b5af-45851e042a75-ovnkube-config\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886281 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-node-log\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886521 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-etc-openvswitch\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886574 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-kubelet\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886607 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-run-openvswitch\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886626 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-systemd-units\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886722 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/831eb063-b498-45bf-b5af-45851e042a75-env-overrides\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886881 4749 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886892 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886902 4749 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-cni-netd\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886913 4749 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886923 4749 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886934 4749 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886944 4749 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886954 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886963 4749 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-slash\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886972 4749 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886983 4749 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-log-socket\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.886996 4749 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-kubelet\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.887005 4749 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-systemd-units\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.887014 4749 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-node-log\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.887022 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.887032 4749 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.887042 4749 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-host-run-netns\") on node \"crc\" DevicePath \"\""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.888793 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-kube-api-access-s8gk7" (OuterVolumeSpecName: "kube-api-access-s8gk7") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "kube-api-access-s8gk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.889634 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.898717 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fac9a20c-b1f6-4bb2-a363-072abb3c04d2" (UID: "fac9a20c-b1f6-4bb2-a363-072abb3c04d2"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988340 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-run-ovn-kubernetes\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988408 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/831eb063-b498-45bf-b5af-45851e042a75-ovnkube-script-lib\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988447 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-cni-netd\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988491 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988533 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-slash\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988540 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-run-ovn-kubernetes\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988569 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-run-systemd\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988597 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/831eb063-b498-45bf-b5af-45851e042a75-ovnkube-config\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988600 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988623 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-node-log\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988647 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-etc-openvswitch\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988648 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-slash\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988672 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-run-systemd\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988679 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-kubelet\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988702 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-node-log\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988735 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-kubelet\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988756 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-run-openvswitch\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988781 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-systemd-units\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988785 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-run-openvswitch\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx"
Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988808 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/831eb063-b498-45bf-b5af-45851e042a75-env-overrides\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-systemd-units\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988841 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-var-lib-openvswitch\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988861 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-cni-bin\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988887 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd8sx\" (UniqueName: \"kubernetes.io/projected/831eb063-b498-45bf-b5af-45851e042a75-kube-api-access-gd8sx\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988870 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-etc-openvswitch\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-var-lib-openvswitch\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988911 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-run-netns\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988963 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-cni-netd\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988941 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-cni-bin\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.988941 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-host-run-netns\") pod 
\"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.989018 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-log-socket\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.989079 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-run-ovn\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.989110 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/831eb063-b498-45bf-b5af-45851e042a75-ovn-node-metrics-cert\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.989139 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-run-ovn\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.989161 4749 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.989176 4749 reconciler_common.go:293] "Volume detached 
for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.989189 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8gk7\" (UniqueName: \"kubernetes.io/projected/fac9a20c-b1f6-4bb2-a363-072abb3c04d2-kube-api-access-s8gk7\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.989178 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/831eb063-b498-45bf-b5af-45851e042a75-log-socket\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.989428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/831eb063-b498-45bf-b5af-45851e042a75-ovnkube-script-lib\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.989589 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/831eb063-b498-45bf-b5af-45851e042a75-env-overrides\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 16:01:23.990134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/831eb063-b498-45bf-b5af-45851e042a75-ovnkube-config\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:23 crc kubenswrapper[4749]: I0310 
16:01:23.992503 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/831eb063-b498-45bf-b5af-45851e042a75-ovn-node-metrics-cert\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.009477 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd8sx\" (UniqueName: \"kubernetes.io/projected/831eb063-b498-45bf-b5af-45851e042a75-kube-api-access-gd8sx\") pod \"ovnkube-node-q2cvx\" (UID: \"831eb063-b498-45bf-b5af-45851e042a75\") " pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.090921 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwpmf_807d12f5-c95a-4a7e-91c5-128de3d2235c/kube-multus/2.log" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.092848 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwpmf_807d12f5-c95a-4a7e-91c5-128de3d2235c/kube-multus/1.log" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.092907 4749 generic.go:334] "Generic (PLEG): container finished" podID="807d12f5-c95a-4a7e-91c5-128de3d2235c" containerID="750eab5b32a357211fac1cfd9b94b2e5c78d0358f83824912d275e65a6761fa0" exitCode=2 Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.092969 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwpmf" event={"ID":"807d12f5-c95a-4a7e-91c5-128de3d2235c","Type":"ContainerDied","Data":"750eab5b32a357211fac1cfd9b94b2e5c78d0358f83824912d275e65a6761fa0"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.093206 4749 scope.go:117] "RemoveContainer" containerID="5018cd45279aff135b2d7eaa3883f7daf5bb7fbe0ceca1c731299f0aa32c35bd" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.093622 4749 scope.go:117] "RemoveContainer" 
containerID="750eab5b32a357211fac1cfd9b94b2e5c78d0358f83824912d275e65a6761fa0" Mar 10 16:01:24 crc kubenswrapper[4749]: E0310 16:01:24.093857 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-gwpmf_openshift-multus(807d12f5-c95a-4a7e-91c5-128de3d2235c)\"" pod="openshift-multus/multus-gwpmf" podUID="807d12f5-c95a-4a7e-91c5-128de3d2235c" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.099344 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovnkube-controller/3.log" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.102006 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovn-acl-logging/0.log" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.102503 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nvpsq_fac9a20c-b1f6-4bb2-a363-072abb3c04d2/ovn-controller/0.log" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.102837 4749 generic.go:334] "Generic (PLEG): container finished" podID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerID="53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c" exitCode=0 Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.102865 4749 generic.go:334] "Generic (PLEG): container finished" podID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerID="84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f" exitCode=0 Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.102875 4749 generic.go:334] "Generic (PLEG): container finished" podID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerID="540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83" exitCode=0 Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 
16:01:24.102884 4749 generic.go:334] "Generic (PLEG): container finished" podID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerID="18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7" exitCode=0 Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.102892 4749 generic.go:334] "Generic (PLEG): container finished" podID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerID="24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f" exitCode=0 Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.102901 4749 generic.go:334] "Generic (PLEG): container finished" podID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerID="b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06" exitCode=0 Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.102909 4749 generic.go:334] "Generic (PLEG): container finished" podID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerID="58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97" exitCode=143 Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.102917 4749 generic.go:334] "Generic (PLEG): container finished" podID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" containerID="3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4" exitCode=143 Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.102873 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerDied","Data":"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.102955 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerDied","Data":"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.102970 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerDied","Data":"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103010 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerDied","Data":"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103024 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerDied","Data":"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103036 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerDied","Data":"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103050 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103061 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103067 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103074 4749 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103080 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103085 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103091 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103098 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103105 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103114 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103125 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerDied","Data":"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97"} Mar 10 16:01:24 crc 
kubenswrapper[4749]: I0310 16:01:24.103136 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103145 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103152 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103162 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103169 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103178 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103184 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103190 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97"} Mar 10 16:01:24 crc 
kubenswrapper[4749]: I0310 16:01:24.103197 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103203 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103215 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerDied","Data":"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103225 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103232 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103238 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103243 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103249 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103255 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103260 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103266 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103272 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103277 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103285 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" event={"ID":"fac9a20c-b1f6-4bb2-a363-072abb3c04d2","Type":"ContainerDied","Data":"399d8a5766b643b87d73c3fec0ecfa3587f68422cd4590de1f592ecd380e4f04"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103293 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103300 4749 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103306 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103311 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103317 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103322 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103329 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103335 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103340 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103347 4749 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6"} Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.103100 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nvpsq" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.135923 4749 scope.go:117] "RemoveContainer" containerID="53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.137055 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nvpsq"] Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.142525 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nvpsq"] Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.148110 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.156504 4749 scope.go:117] "RemoveContainer" containerID="223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.178242 4749 scope.go:117] "RemoveContainer" containerID="84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f" Mar 10 16:01:24 crc kubenswrapper[4749]: W0310 16:01:24.188939 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod831eb063_b498_45bf_b5af_45851e042a75.slice/crio-fefd1989b3f6d3777e9f8642216584386858555648e75618f49c16e1f98369a9 WatchSource:0}: Error finding container fefd1989b3f6d3777e9f8642216584386858555648e75618f49c16e1f98369a9: Status 404 returned error can't find the container with id fefd1989b3f6d3777e9f8642216584386858555648e75618f49c16e1f98369a9 Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.196579 4749 scope.go:117] "RemoveContainer" containerID="540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.221078 4749 scope.go:117] "RemoveContainer" containerID="18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.235195 4749 scope.go:117] "RemoveContainer" containerID="24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.254658 4749 scope.go:117] "RemoveContainer" containerID="b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.271560 4749 scope.go:117] "RemoveContainer" containerID="58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.294055 4749 scope.go:117] "RemoveContainer" 
containerID="3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.358585 4749 scope.go:117] "RemoveContainer" containerID="7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.375630 4749 scope.go:117] "RemoveContainer" containerID="53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c" Mar 10 16:01:24 crc kubenswrapper[4749]: E0310 16:01:24.376110 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c\": container with ID starting with 53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c not found: ID does not exist" containerID="53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.376142 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c"} err="failed to get container status \"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c\": rpc error: code = NotFound desc = could not find container \"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c\": container with ID starting with 53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.376176 4749 scope.go:117] "RemoveContainer" containerID="223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4" Mar 10 16:01:24 crc kubenswrapper[4749]: E0310 16:01:24.376692 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\": container with ID starting with 
223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4 not found: ID does not exist" containerID="223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.376725 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4"} err="failed to get container status \"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\": rpc error: code = NotFound desc = could not find container \"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\": container with ID starting with 223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.376740 4749 scope.go:117] "RemoveContainer" containerID="84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f" Mar 10 16:01:24 crc kubenswrapper[4749]: E0310 16:01:24.376999 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\": container with ID starting with 84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f not found: ID does not exist" containerID="84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.377018 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f"} err="failed to get container status \"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\": rpc error: code = NotFound desc = could not find container \"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\": container with ID starting with 84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f not found: ID does not 
exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.377030 4749 scope.go:117] "RemoveContainer" containerID="540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83" Mar 10 16:01:24 crc kubenswrapper[4749]: E0310 16:01:24.377270 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\": container with ID starting with 540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83 not found: ID does not exist" containerID="540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.377292 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83"} err="failed to get container status \"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\": rpc error: code = NotFound desc = could not find container \"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\": container with ID starting with 540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.377307 4749 scope.go:117] "RemoveContainer" containerID="18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7" Mar 10 16:01:24 crc kubenswrapper[4749]: E0310 16:01:24.377603 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\": container with ID starting with 18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7 not found: ID does not exist" containerID="18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.377623 4749 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7"} err="failed to get container status \"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\": rpc error: code = NotFound desc = could not find container \"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\": container with ID starting with 18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.377636 4749 scope.go:117] "RemoveContainer" containerID="24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f" Mar 10 16:01:24 crc kubenswrapper[4749]: E0310 16:01:24.377893 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\": container with ID starting with 24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f not found: ID does not exist" containerID="24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.377919 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f"} err="failed to get container status \"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\": rpc error: code = NotFound desc = could not find container \"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\": container with ID starting with 24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.377934 4749 scope.go:117] "RemoveContainer" containerID="b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06" Mar 10 16:01:24 crc kubenswrapper[4749]: E0310 16:01:24.378216 4749 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\": container with ID starting with b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06 not found: ID does not exist" containerID="b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.378242 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06"} err="failed to get container status \"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\": rpc error: code = NotFound desc = could not find container \"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\": container with ID starting with b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.378258 4749 scope.go:117] "RemoveContainer" containerID="58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97" Mar 10 16:01:24 crc kubenswrapper[4749]: E0310 16:01:24.378557 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\": container with ID starting with 58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97 not found: ID does not exist" containerID="58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.378583 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97"} err="failed to get container status \"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\": rpc error: code = NotFound desc = could 
not find container \"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\": container with ID starting with 58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.378598 4749 scope.go:117] "RemoveContainer" containerID="3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4" Mar 10 16:01:24 crc kubenswrapper[4749]: E0310 16:01:24.378844 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\": container with ID starting with 3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4 not found: ID does not exist" containerID="3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.378870 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4"} err="failed to get container status \"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\": rpc error: code = NotFound desc = could not find container \"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\": container with ID starting with 3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.378887 4749 scope.go:117] "RemoveContainer" containerID="7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6" Mar 10 16:01:24 crc kubenswrapper[4749]: E0310 16:01:24.379284 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\": container with ID starting with 7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6 not found: 
ID does not exist" containerID="7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.379309 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6"} err="failed to get container status \"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\": rpc error: code = NotFound desc = could not find container \"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\": container with ID starting with 7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.379329 4749 scope.go:117] "RemoveContainer" containerID="53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.379694 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c"} err="failed to get container status \"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c\": rpc error: code = NotFound desc = could not find container \"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c\": container with ID starting with 53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.379714 4749 scope.go:117] "RemoveContainer" containerID="223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.380010 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4"} err="failed to get container status \"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\": rpc error: code = 
NotFound desc = could not find container \"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\": container with ID starting with 223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.380034 4749 scope.go:117] "RemoveContainer" containerID="84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.380309 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f"} err="failed to get container status \"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\": rpc error: code = NotFound desc = could not find container \"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\": container with ID starting with 84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.380352 4749 scope.go:117] "RemoveContainer" containerID="540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.380658 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83"} err="failed to get container status \"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\": rpc error: code = NotFound desc = could not find container \"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\": container with ID starting with 540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.380678 4749 scope.go:117] "RemoveContainer" containerID="18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7" Mar 10 16:01:24 crc 
kubenswrapper[4749]: I0310 16:01:24.380930 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7"} err="failed to get container status \"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\": rpc error: code = NotFound desc = could not find container \"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\": container with ID starting with 18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.380951 4749 scope.go:117] "RemoveContainer" containerID="24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.381249 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f"} err="failed to get container status \"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\": rpc error: code = NotFound desc = could not find container \"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\": container with ID starting with 24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.381276 4749 scope.go:117] "RemoveContainer" containerID="b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.381571 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06"} err="failed to get container status \"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\": rpc error: code = NotFound desc = could not find container \"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\": container 
with ID starting with b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.381591 4749 scope.go:117] "RemoveContainer" containerID="58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.383664 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97"} err="failed to get container status \"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\": rpc error: code = NotFound desc = could not find container \"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\": container with ID starting with 58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.383685 4749 scope.go:117] "RemoveContainer" containerID="3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.384297 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4"} err="failed to get container status \"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\": rpc error: code = NotFound desc = could not find container \"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\": container with ID starting with 3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.384350 4749 scope.go:117] "RemoveContainer" containerID="7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.384892 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6"} err="failed to get container status \"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\": rpc error: code = NotFound desc = could not find container \"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\": container with ID starting with 7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.384927 4749 scope.go:117] "RemoveContainer" containerID="53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.385235 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c"} err="failed to get container status \"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c\": rpc error: code = NotFound desc = could not find container \"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c\": container with ID starting with 53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.385306 4749 scope.go:117] "RemoveContainer" containerID="223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.385747 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4"} err="failed to get container status \"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\": rpc error: code = NotFound desc = could not find container \"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\": container with ID starting with 223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4 not found: ID does not 
exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.385768 4749 scope.go:117] "RemoveContainer" containerID="84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.386526 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f"} err="failed to get container status \"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\": rpc error: code = NotFound desc = could not find container \"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\": container with ID starting with 84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.386597 4749 scope.go:117] "RemoveContainer" containerID="540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.386971 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83"} err="failed to get container status \"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\": rpc error: code = NotFound desc = could not find container \"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\": container with ID starting with 540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.386997 4749 scope.go:117] "RemoveContainer" containerID="18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.387665 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7"} err="failed to get container status 
\"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\": rpc error: code = NotFound desc = could not find container \"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\": container with ID starting with 18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.387688 4749 scope.go:117] "RemoveContainer" containerID="24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.388491 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f"} err="failed to get container status \"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\": rpc error: code = NotFound desc = could not find container \"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\": container with ID starting with 24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.388540 4749 scope.go:117] "RemoveContainer" containerID="b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.389046 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06"} err="failed to get container status \"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\": rpc error: code = NotFound desc = could not find container \"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\": container with ID starting with b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.389118 4749 scope.go:117] "RemoveContainer" 
containerID="58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.389430 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97"} err="failed to get container status \"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\": rpc error: code = NotFound desc = could not find container \"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\": container with ID starting with 58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.389453 4749 scope.go:117] "RemoveContainer" containerID="3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.389728 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4"} err="failed to get container status \"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\": rpc error: code = NotFound desc = could not find container \"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\": container with ID starting with 3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.389752 4749 scope.go:117] "RemoveContainer" containerID="7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.389978 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6"} err="failed to get container status \"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\": rpc error: code = NotFound desc = could 
not find container \"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\": container with ID starting with 7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.390019 4749 scope.go:117] "RemoveContainer" containerID="53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.390268 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c"} err="failed to get container status \"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c\": rpc error: code = NotFound desc = could not find container \"53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c\": container with ID starting with 53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.390292 4749 scope.go:117] "RemoveContainer" containerID="223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.390751 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4"} err="failed to get container status \"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\": rpc error: code = NotFound desc = could not find container \"223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4\": container with ID starting with 223796e650f37bdd395cdb1125f49177908f665c9323eb9daf3686d7b21e99d4 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.390775 4749 scope.go:117] "RemoveContainer" containerID="84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 
16:01:24.391056 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f"} err="failed to get container status \"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\": rpc error: code = NotFound desc = could not find container \"84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f\": container with ID starting with 84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.391079 4749 scope.go:117] "RemoveContainer" containerID="540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.391359 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83"} err="failed to get container status \"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\": rpc error: code = NotFound desc = could not find container \"540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83\": container with ID starting with 540ea57af6dc299cd11b15a295f44531749e2a0139b5639ebdd0a81233fe6f83 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.391404 4749 scope.go:117] "RemoveContainer" containerID="18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.391806 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7"} err="failed to get container status \"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\": rpc error: code = NotFound desc = could not find container \"18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7\": container with ID starting with 
18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.391824 4749 scope.go:117] "RemoveContainer" containerID="24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.392167 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f"} err="failed to get container status \"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\": rpc error: code = NotFound desc = could not find container \"24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f\": container with ID starting with 24825dd4be46030f155457164cfc2bf0acaf0c9b242fcd8de549c8bc5338097f not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.392190 4749 scope.go:117] "RemoveContainer" containerID="b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.392501 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06"} err="failed to get container status \"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\": rpc error: code = NotFound desc = could not find container \"b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06\": container with ID starting with b28a1b01e473e74fef8158562ba34646899f953cbf3a35c2cf3fff394c244e06 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.392524 4749 scope.go:117] "RemoveContainer" containerID="58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.392800 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97"} err="failed to get container status \"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\": rpc error: code = NotFound desc = could not find container \"58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97\": container with ID starting with 58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.392822 4749 scope.go:117] "RemoveContainer" containerID="3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.393065 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4"} err="failed to get container status \"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\": rpc error: code = NotFound desc = could not find container \"3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4\": container with ID starting with 3f327b20036a3a38203c933c470a2f87f9f78834f4bb67182d7bbeb36252ccf4 not found: ID does not exist" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.393089 4749 scope.go:117] "RemoveContainer" containerID="7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6" Mar 10 16:01:24 crc kubenswrapper[4749]: I0310 16:01:24.393962 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6"} err="failed to get container status \"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\": rpc error: code = NotFound desc = could not find container \"7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6\": container with ID starting with 7bc80250bf1576cfc0b402cc909a3e19d425c910839959c42e27a7e5b6a25db6 not found: ID does not 
exist" Mar 10 16:01:25 crc kubenswrapper[4749]: I0310 16:01:25.112971 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwpmf_807d12f5-c95a-4a7e-91c5-128de3d2235c/kube-multus/2.log" Mar 10 16:01:25 crc kubenswrapper[4749]: I0310 16:01:25.115132 4749 generic.go:334] "Generic (PLEG): container finished" podID="831eb063-b498-45bf-b5af-45851e042a75" containerID="100da3c9f9be9da94b1016d6f04e1b2c0c4896844c6fc23944f659642faeae42" exitCode=0 Mar 10 16:01:25 crc kubenswrapper[4749]: I0310 16:01:25.115196 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" event={"ID":"831eb063-b498-45bf-b5af-45851e042a75","Type":"ContainerDied","Data":"100da3c9f9be9da94b1016d6f04e1b2c0c4896844c6fc23944f659642faeae42"} Mar 10 16:01:25 crc kubenswrapper[4749]: I0310 16:01:25.115234 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" event={"ID":"831eb063-b498-45bf-b5af-45851e042a75","Type":"ContainerStarted","Data":"fefd1989b3f6d3777e9f8642216584386858555648e75618f49c16e1f98369a9"} Mar 10 16:01:25 crc kubenswrapper[4749]: I0310 16:01:25.616914 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fac9a20c-b1f6-4bb2-a363-072abb3c04d2" path="/var/lib/kubelet/pods/fac9a20c-b1f6-4bb2-a363-072abb3c04d2/volumes" Mar 10 16:01:26 crc kubenswrapper[4749]: I0310 16:01:26.128750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" event={"ID":"831eb063-b498-45bf-b5af-45851e042a75","Type":"ContainerStarted","Data":"f03c4dbff2d0cc8c0d000db045b2451609584c83d11464e9938f6f7d274f181c"} Mar 10 16:01:26 crc kubenswrapper[4749]: I0310 16:01:26.129422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" 
event={"ID":"831eb063-b498-45bf-b5af-45851e042a75","Type":"ContainerStarted","Data":"303d5144501960fdc7332e6e6995bc27423952bfaea143c3255964b16837e100"} Mar 10 16:01:26 crc kubenswrapper[4749]: I0310 16:01:26.129548 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" event={"ID":"831eb063-b498-45bf-b5af-45851e042a75","Type":"ContainerStarted","Data":"0ae2e36d995a60a7462209dd7ae1f0fa6c98f8e8ad81fc3e4507cbb159288e61"} Mar 10 16:01:26 crc kubenswrapper[4749]: I0310 16:01:26.129633 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" event={"ID":"831eb063-b498-45bf-b5af-45851e042a75","Type":"ContainerStarted","Data":"fe60e54d927212277d64c4666d5138dd014367648d6572551e07d36a81691371"} Mar 10 16:01:26 crc kubenswrapper[4749]: I0310 16:01:26.129711 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" event={"ID":"831eb063-b498-45bf-b5af-45851e042a75","Type":"ContainerStarted","Data":"eca2af6af322feec9a2166ddb2d55ec21c6b0b15539f35e51501fd9785831dbe"} Mar 10 16:01:26 crc kubenswrapper[4749]: I0310 16:01:26.129793 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" event={"ID":"831eb063-b498-45bf-b5af-45851e042a75","Type":"ContainerStarted","Data":"52e61d5b60b33da177dbf004bffa0b11f67a2beee3c0bb33577da11dc7fc1592"} Mar 10 16:01:28 crc kubenswrapper[4749]: I0310 16:01:28.147517 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" event={"ID":"831eb063-b498-45bf-b5af-45851e042a75","Type":"ContainerStarted","Data":"069f6e17df7ca15a85d81af35737b829c312673afe96d2c9b678f13b1a0adf77"} Mar 10 16:01:29 crc kubenswrapper[4749]: I0310 16:01:29.874878 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh"] Mar 10 16:01:29 crc 
kubenswrapper[4749]: I0310 16:01:29.876348 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:29 crc kubenswrapper[4749]: I0310 16:01:29.878584 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 16:01:29 crc kubenswrapper[4749]: I0310 16:01:29.976914 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2a63022-7abc-4ef6-81fa-da39b0121c51-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh\" (UID: \"d2a63022-7abc-4ef6-81fa-da39b0121c51\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:29 crc kubenswrapper[4749]: I0310 16:01:29.977016 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfs5v\" (UniqueName: \"kubernetes.io/projected/d2a63022-7abc-4ef6-81fa-da39b0121c51-kube-api-access-sfs5v\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh\" (UID: \"d2a63022-7abc-4ef6-81fa-da39b0121c51\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:29 crc kubenswrapper[4749]: I0310 16:01:29.977053 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2a63022-7abc-4ef6-81fa-da39b0121c51-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh\" (UID: \"d2a63022-7abc-4ef6-81fa-da39b0121c51\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:30 crc kubenswrapper[4749]: I0310 16:01:30.078601 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sfs5v\" (UniqueName: \"kubernetes.io/projected/d2a63022-7abc-4ef6-81fa-da39b0121c51-kube-api-access-sfs5v\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh\" (UID: \"d2a63022-7abc-4ef6-81fa-da39b0121c51\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:30 crc kubenswrapper[4749]: I0310 16:01:30.078693 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2a63022-7abc-4ef6-81fa-da39b0121c51-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh\" (UID: \"d2a63022-7abc-4ef6-81fa-da39b0121c51\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:30 crc kubenswrapper[4749]: I0310 16:01:30.078766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2a63022-7abc-4ef6-81fa-da39b0121c51-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh\" (UID: \"d2a63022-7abc-4ef6-81fa-da39b0121c51\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:30 crc kubenswrapper[4749]: I0310 16:01:30.079332 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2a63022-7abc-4ef6-81fa-da39b0121c51-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh\" (UID: \"d2a63022-7abc-4ef6-81fa-da39b0121c51\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:30 crc kubenswrapper[4749]: I0310 16:01:30.079544 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2a63022-7abc-4ef6-81fa-da39b0121c51-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh\" 
(UID: \"d2a63022-7abc-4ef6-81fa-da39b0121c51\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:30 crc kubenswrapper[4749]: I0310 16:01:30.107604 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfs5v\" (UniqueName: \"kubernetes.io/projected/d2a63022-7abc-4ef6-81fa-da39b0121c51-kube-api-access-sfs5v\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh\" (UID: \"d2a63022-7abc-4ef6-81fa-da39b0121c51\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:30 crc kubenswrapper[4749]: I0310 16:01:30.197006 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:30 crc kubenswrapper[4749]: E0310 16:01:30.239650 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace_d2a63022-7abc-4ef6-81fa-da39b0121c51_0(a67864fae15030a0556c41b7b89a092746a9f15579077123d0ae1269665ac786): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 16:01:30 crc kubenswrapper[4749]: E0310 16:01:30.239760 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace_d2a63022-7abc-4ef6-81fa-da39b0121c51_0(a67864fae15030a0556c41b7b89a092746a9f15579077123d0ae1269665ac786): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:30 crc kubenswrapper[4749]: E0310 16:01:30.239789 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace_d2a63022-7abc-4ef6-81fa-da39b0121c51_0(a67864fae15030a0556c41b7b89a092746a9f15579077123d0ae1269665ac786): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:30 crc kubenswrapper[4749]: E0310 16:01:30.239856 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace(d2a63022-7abc-4ef6-81fa-da39b0121c51)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace(d2a63022-7abc-4ef6-81fa-da39b0121c51)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace_d2a63022-7abc-4ef6-81fa-da39b0121c51_0(a67864fae15030a0556c41b7b89a092746a9f15579077123d0ae1269665ac786): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" podUID="d2a63022-7abc-4ef6-81fa-da39b0121c51" Mar 10 16:01:31 crc kubenswrapper[4749]: I0310 16:01:31.141436 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh"] Mar 10 16:01:31 crc kubenswrapper[4749]: I0310 16:01:31.172151 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" event={"ID":"831eb063-b498-45bf-b5af-45851e042a75","Type":"ContainerStarted","Data":"8706be9dfbcd257a20d361373e121b26dc49a002bac955444128bdc1110961f1"} Mar 10 16:01:31 crc kubenswrapper[4749]: I0310 16:01:31.172184 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:31 crc kubenswrapper[4749]: I0310 16:01:31.172838 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:31 crc kubenswrapper[4749]: E0310 16:01:31.200600 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace_d2a63022-7abc-4ef6-81fa-da39b0121c51_0(cd52eff9cdde4d49e81eef34e17fabc5cfec0c7d5cc9f94034fa02d786c26d34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 16:01:31 crc kubenswrapper[4749]: E0310 16:01:31.200706 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace_d2a63022-7abc-4ef6-81fa-da39b0121c51_0(cd52eff9cdde4d49e81eef34e17fabc5cfec0c7d5cc9f94034fa02d786c26d34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:31 crc kubenswrapper[4749]: E0310 16:01:31.200742 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace_d2a63022-7abc-4ef6-81fa-da39b0121c51_0(cd52eff9cdde4d49e81eef34e17fabc5cfec0c7d5cc9f94034fa02d786c26d34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:31 crc kubenswrapper[4749]: E0310 16:01:31.200801 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace(d2a63022-7abc-4ef6-81fa-da39b0121c51)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace(d2a63022-7abc-4ef6-81fa-da39b0121c51)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace_d2a63022-7abc-4ef6-81fa-da39b0121c51_0(cd52eff9cdde4d49e81eef34e17fabc5cfec0c7d5cc9f94034fa02d786c26d34): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" podUID="d2a63022-7abc-4ef6-81fa-da39b0121c51" Mar 10 16:01:31 crc kubenswrapper[4749]: I0310 16:01:31.206571 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" podStartSLOduration=8.206549462 podStartE2EDuration="8.206549462s" podCreationTimestamp="2026-03-10 16:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:01:31.20275738 +0000 UTC m=+788.324623077" watchObservedRunningTime="2026-03-10 16:01:31.206549462 +0000 UTC m=+788.328415149" Mar 10 16:01:32 crc kubenswrapper[4749]: I0310 16:01:32.177359 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:32 crc kubenswrapper[4749]: I0310 16:01:32.177423 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:32 crc kubenswrapper[4749]: I0310 16:01:32.177468 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:32 crc kubenswrapper[4749]: I0310 16:01:32.210185 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:32 crc kubenswrapper[4749]: I0310 16:01:32.210613 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:32 crc kubenswrapper[4749]: E0310 16:01:32.366436 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7.scope\": RecentStats: unable to find data in memory cache]" Mar 10 16:01:37 crc kubenswrapper[4749]: I0310 16:01:37.607275 4749 scope.go:117] "RemoveContainer" containerID="750eab5b32a357211fac1cfd9b94b2e5c78d0358f83824912d275e65a6761fa0" Mar 10 16:01:37 crc kubenswrapper[4749]: E0310 16:01:37.607971 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-gwpmf_openshift-multus(807d12f5-c95a-4a7e-91c5-128de3d2235c)\"" pod="openshift-multus/multus-gwpmf" podUID="807d12f5-c95a-4a7e-91c5-128de3d2235c" Mar 10 16:01:42 crc kubenswrapper[4749]: E0310 16:01:42.489099 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c.scope\": RecentStats: unable to find data in memory cache]" Mar 10 16:01:42 crc kubenswrapper[4749]: I0310 16:01:42.606235 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:42 crc kubenswrapper[4749]: I0310 16:01:42.607007 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:42 crc kubenswrapper[4749]: E0310 16:01:42.642903 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace_d2a63022-7abc-4ef6-81fa-da39b0121c51_0(ed15f9fad773380966323e61867dde0ed47939fb024fc692a16a85e1beae49e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 16:01:42 crc kubenswrapper[4749]: E0310 16:01:42.643004 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace_d2a63022-7abc-4ef6-81fa-da39b0121c51_0(ed15f9fad773380966323e61867dde0ed47939fb024fc692a16a85e1beae49e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:42 crc kubenswrapper[4749]: E0310 16:01:42.643039 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace_d2a63022-7abc-4ef6-81fa-da39b0121c51_0(ed15f9fad773380966323e61867dde0ed47939fb024fc692a16a85e1beae49e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:42 crc kubenswrapper[4749]: E0310 16:01:42.643132 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace(d2a63022-7abc-4ef6-81fa-da39b0121c51)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace(d2a63022-7abc-4ef6-81fa-da39b0121c51)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_openshift-marketplace_d2a63022-7abc-4ef6-81fa-da39b0121c51_0(ed15f9fad773380966323e61867dde0ed47939fb024fc692a16a85e1beae49e1): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" podUID="d2a63022-7abc-4ef6-81fa-da39b0121c51" Mar 10 16:01:49 crc kubenswrapper[4749]: I0310 16:01:49.607581 4749 scope.go:117] "RemoveContainer" containerID="750eab5b32a357211fac1cfd9b94b2e5c78d0358f83824912d275e65a6761fa0" Mar 10 16:01:50 crc kubenswrapper[4749]: I0310 16:01:50.303671 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gwpmf_807d12f5-c95a-4a7e-91c5-128de3d2235c/kube-multus/2.log" Mar 10 16:01:50 crc kubenswrapper[4749]: I0310 16:01:50.304083 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gwpmf" event={"ID":"807d12f5-c95a-4a7e-91c5-128de3d2235c","Type":"ContainerStarted","Data":"a990ff7a6ab9aaf40cc0f09b27634d4bd470c52055849b8cb6e6dcf68b3c86b7"} Mar 10 16:01:52 crc kubenswrapper[4749]: E0310 16:01:52.652548 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7.scope\": RecentStats: unable to find data 
in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f.scope\": RecentStats: unable to find data in memory cache]" Mar 10 16:01:54 crc kubenswrapper[4749]: I0310 16:01:54.181502 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q2cvx" Mar 10 16:01:54 crc kubenswrapper[4749]: I0310 16:01:54.606671 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:54 crc kubenswrapper[4749]: I0310 16:01:54.607596 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:01:54 crc kubenswrapper[4749]: I0310 16:01:54.832370 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh"] Mar 10 16:01:55 crc kubenswrapper[4749]: I0310 16:01:55.340665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" event={"ID":"d2a63022-7abc-4ef6-81fa-da39b0121c51","Type":"ContainerStarted","Data":"938ac8477ea05a1591a51fc7329365453ab9351e63bbd7bd4e26508a5a4878ff"} Mar 10 16:01:55 crc kubenswrapper[4749]: I0310 16:01:55.341187 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" event={"ID":"d2a63022-7abc-4ef6-81fa-da39b0121c51","Type":"ContainerStarted","Data":"14db064859d9737eca05fb016aa824a225957aea96f0f294669c0dd96c0a4d6d"} Mar 10 16:01:57 crc kubenswrapper[4749]: I0310 16:01:57.357560 4749 generic.go:334] "Generic (PLEG): container finished" podID="d2a63022-7abc-4ef6-81fa-da39b0121c51" containerID="938ac8477ea05a1591a51fc7329365453ab9351e63bbd7bd4e26508a5a4878ff" exitCode=0 Mar 10 16:01:57 crc kubenswrapper[4749]: I0310 16:01:57.357685 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" event={"ID":"d2a63022-7abc-4ef6-81fa-da39b0121c51","Type":"ContainerDied","Data":"938ac8477ea05a1591a51fc7329365453ab9351e63bbd7bd4e26508a5a4878ff"} Mar 10 16:01:59 crc kubenswrapper[4749]: I0310 16:01:59.373601 4749 generic.go:334] "Generic (PLEG): container finished" podID="d2a63022-7abc-4ef6-81fa-da39b0121c51" containerID="b5f706084d7a1d77ccbfdcaef92f50aff4093517599e6cde9e505297456bda1d" exitCode=0 Mar 10 16:01:59 crc kubenswrapper[4749]: I0310 16:01:59.373733 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" event={"ID":"d2a63022-7abc-4ef6-81fa-da39b0121c51","Type":"ContainerDied","Data":"b5f706084d7a1d77ccbfdcaef92f50aff4093517599e6cde9e505297456bda1d"} Mar 10 16:02:00 crc kubenswrapper[4749]: I0310 16:02:00.145054 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552642-9nnnj"] Mar 10 16:02:00 crc kubenswrapper[4749]: I0310 16:02:00.146050 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552642-9nnnj" Mar 10 16:02:00 crc kubenswrapper[4749]: I0310 16:02:00.150050 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:02:00 crc kubenswrapper[4749]: I0310 16:02:00.150908 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:02:00 crc kubenswrapper[4749]: I0310 16:02:00.153468 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552642-9nnnj"] Mar 10 16:02:00 crc kubenswrapper[4749]: I0310 16:02:00.154706 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:02:00 crc kubenswrapper[4749]: I0310 16:02:00.175943 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tgq7\" (UniqueName: \"kubernetes.io/projected/460ee780-c5e3-437a-9a2a-3ed268e2173a-kube-api-access-6tgq7\") pod \"auto-csr-approver-29552642-9nnnj\" (UID: \"460ee780-c5e3-437a-9a2a-3ed268e2173a\") " pod="openshift-infra/auto-csr-approver-29552642-9nnnj" Mar 10 16:02:00 crc kubenswrapper[4749]: I0310 16:02:00.277164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tgq7\" (UniqueName: 
\"kubernetes.io/projected/460ee780-c5e3-437a-9a2a-3ed268e2173a-kube-api-access-6tgq7\") pod \"auto-csr-approver-29552642-9nnnj\" (UID: \"460ee780-c5e3-437a-9a2a-3ed268e2173a\") " pod="openshift-infra/auto-csr-approver-29552642-9nnnj" Mar 10 16:02:00 crc kubenswrapper[4749]: I0310 16:02:00.306769 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tgq7\" (UniqueName: \"kubernetes.io/projected/460ee780-c5e3-437a-9a2a-3ed268e2173a-kube-api-access-6tgq7\") pod \"auto-csr-approver-29552642-9nnnj\" (UID: \"460ee780-c5e3-437a-9a2a-3ed268e2173a\") " pod="openshift-infra/auto-csr-approver-29552642-9nnnj" Mar 10 16:02:00 crc kubenswrapper[4749]: I0310 16:02:00.384517 4749 generic.go:334] "Generic (PLEG): container finished" podID="d2a63022-7abc-4ef6-81fa-da39b0121c51" containerID="d4b41e8370c7733a21b75c2b7df7dbf9ddcea72be100d49f9e1e12041fbfca2d" exitCode=0 Mar 10 16:02:00 crc kubenswrapper[4749]: I0310 16:02:00.384563 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" event={"ID":"d2a63022-7abc-4ef6-81fa-da39b0121c51","Type":"ContainerDied","Data":"d4b41e8370c7733a21b75c2b7df7dbf9ddcea72be100d49f9e1e12041fbfca2d"} Mar 10 16:02:00 crc kubenswrapper[4749]: I0310 16:02:00.478539 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552642-9nnnj" Mar 10 16:02:00 crc kubenswrapper[4749]: I0310 16:02:00.692458 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552642-9nnnj"] Mar 10 16:02:01 crc kubenswrapper[4749]: I0310 16:02:01.393874 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552642-9nnnj" event={"ID":"460ee780-c5e3-437a-9a2a-3ed268e2173a","Type":"ContainerStarted","Data":"83d074723663254631b9dd0d9e6119b4907ea60bb4d1f483bd9d691f60f551e6"} Mar 10 16:02:01 crc kubenswrapper[4749]: I0310 16:02:01.699346 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:02:01 crc kubenswrapper[4749]: I0310 16:02:01.901070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2a63022-7abc-4ef6-81fa-da39b0121c51-util\") pod \"d2a63022-7abc-4ef6-81fa-da39b0121c51\" (UID: \"d2a63022-7abc-4ef6-81fa-da39b0121c51\") " Mar 10 16:02:01 crc kubenswrapper[4749]: I0310 16:02:01.901288 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2a63022-7abc-4ef6-81fa-da39b0121c51-bundle\") pod \"d2a63022-7abc-4ef6-81fa-da39b0121c51\" (UID: \"d2a63022-7abc-4ef6-81fa-da39b0121c51\") " Mar 10 16:02:01 crc kubenswrapper[4749]: I0310 16:02:01.901339 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfs5v\" (UniqueName: \"kubernetes.io/projected/d2a63022-7abc-4ef6-81fa-da39b0121c51-kube-api-access-sfs5v\") pod \"d2a63022-7abc-4ef6-81fa-da39b0121c51\" (UID: \"d2a63022-7abc-4ef6-81fa-da39b0121c51\") " Mar 10 16:02:01 crc kubenswrapper[4749]: I0310 16:02:01.902266 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d2a63022-7abc-4ef6-81fa-da39b0121c51-bundle" (OuterVolumeSpecName: "bundle") pod "d2a63022-7abc-4ef6-81fa-da39b0121c51" (UID: "d2a63022-7abc-4ef6-81fa-da39b0121c51"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:02:01 crc kubenswrapper[4749]: I0310 16:02:01.907529 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2a63022-7abc-4ef6-81fa-da39b0121c51-kube-api-access-sfs5v" (OuterVolumeSpecName: "kube-api-access-sfs5v") pod "d2a63022-7abc-4ef6-81fa-da39b0121c51" (UID: "d2a63022-7abc-4ef6-81fa-da39b0121c51"). InnerVolumeSpecName "kube-api-access-sfs5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:02:01 crc kubenswrapper[4749]: I0310 16:02:01.915855 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2a63022-7abc-4ef6-81fa-da39b0121c51-util" (OuterVolumeSpecName: "util") pod "d2a63022-7abc-4ef6-81fa-da39b0121c51" (UID: "d2a63022-7abc-4ef6-81fa-da39b0121c51"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:02:02 crc kubenswrapper[4749]: I0310 16:02:02.003349 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2a63022-7abc-4ef6-81fa-da39b0121c51-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:02 crc kubenswrapper[4749]: I0310 16:02:02.004001 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfs5v\" (UniqueName: \"kubernetes.io/projected/d2a63022-7abc-4ef6-81fa-da39b0121c51-kube-api-access-sfs5v\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:02 crc kubenswrapper[4749]: I0310 16:02:02.004041 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2a63022-7abc-4ef6-81fa-da39b0121c51-util\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:02 crc kubenswrapper[4749]: I0310 16:02:02.402910 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" event={"ID":"d2a63022-7abc-4ef6-81fa-da39b0121c51","Type":"ContainerDied","Data":"14db064859d9737eca05fb016aa824a225957aea96f0f294669c0dd96c0a4d6d"} Mar 10 16:02:02 crc kubenswrapper[4749]: I0310 16:02:02.402967 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14db064859d9737eca05fb016aa824a225957aea96f0f294669c0dd96c0a4d6d" Mar 10 16:02:02 crc kubenswrapper[4749]: I0310 16:02:02.403107 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh" Mar 10 16:02:02 crc kubenswrapper[4749]: E0310 16:02:02.798549 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c.scope\": RecentStats: unable to find data in memory cache]" Mar 10 
16:02:03 crc kubenswrapper[4749]: I0310 16:02:03.412956 4749 generic.go:334] "Generic (PLEG): container finished" podID="460ee780-c5e3-437a-9a2a-3ed268e2173a" containerID="5eb2f36eb800ec1839f8b6d93dbba154cda17aa966028a77c2bbf78455bc15bf" exitCode=0 Mar 10 16:02:03 crc kubenswrapper[4749]: I0310 16:02:03.413046 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552642-9nnnj" event={"ID":"460ee780-c5e3-437a-9a2a-3ed268e2173a","Type":"ContainerDied","Data":"5eb2f36eb800ec1839f8b6d93dbba154cda17aa966028a77c2bbf78455bc15bf"} Mar 10 16:02:04 crc kubenswrapper[4749]: I0310 16:02:04.700072 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552642-9nnnj" Mar 10 16:02:04 crc kubenswrapper[4749]: I0310 16:02:04.841694 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tgq7\" (UniqueName: \"kubernetes.io/projected/460ee780-c5e3-437a-9a2a-3ed268e2173a-kube-api-access-6tgq7\") pod \"460ee780-c5e3-437a-9a2a-3ed268e2173a\" (UID: \"460ee780-c5e3-437a-9a2a-3ed268e2173a\") " Mar 10 16:02:04 crc kubenswrapper[4749]: I0310 16:02:04.848963 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/460ee780-c5e3-437a-9a2a-3ed268e2173a-kube-api-access-6tgq7" (OuterVolumeSpecName: "kube-api-access-6tgq7") pod "460ee780-c5e3-437a-9a2a-3ed268e2173a" (UID: "460ee780-c5e3-437a-9a2a-3ed268e2173a"). InnerVolumeSpecName "kube-api-access-6tgq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:02:04 crc kubenswrapper[4749]: I0310 16:02:04.943802 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tgq7\" (UniqueName: \"kubernetes.io/projected/460ee780-c5e3-437a-9a2a-3ed268e2173a-kube-api-access-6tgq7\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:04 crc kubenswrapper[4749]: I0310 16:02:04.993018 4749 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 16:02:05 crc kubenswrapper[4749]: I0310 16:02:05.428882 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552642-9nnnj" event={"ID":"460ee780-c5e3-437a-9a2a-3ed268e2173a","Type":"ContainerDied","Data":"83d074723663254631b9dd0d9e6119b4907ea60bb4d1f483bd9d691f60f551e6"} Mar 10 16:02:05 crc kubenswrapper[4749]: I0310 16:02:05.429430 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83d074723663254631b9dd0d9e6119b4907ea60bb4d1f483bd9d691f60f551e6" Mar 10 16:02:05 crc kubenswrapper[4749]: I0310 16:02:05.428961 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552642-9nnnj" Mar 10 16:02:05 crc kubenswrapper[4749]: I0310 16:02:05.786517 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552636-dldtb"] Mar 10 16:02:05 crc kubenswrapper[4749]: I0310 16:02:05.790304 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552636-dldtb"] Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.553900 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-mht6p"] Mar 10 16:02:06 crc kubenswrapper[4749]: E0310 16:02:06.554186 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a63022-7abc-4ef6-81fa-da39b0121c51" containerName="util" Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.554201 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a63022-7abc-4ef6-81fa-da39b0121c51" containerName="util" Mar 10 16:02:06 crc kubenswrapper[4749]: E0310 16:02:06.554212 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="460ee780-c5e3-437a-9a2a-3ed268e2173a" containerName="oc" Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.554220 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="460ee780-c5e3-437a-9a2a-3ed268e2173a" containerName="oc" Mar 10 16:02:06 crc kubenswrapper[4749]: E0310 16:02:06.554232 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a63022-7abc-4ef6-81fa-da39b0121c51" containerName="extract" Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.554240 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a63022-7abc-4ef6-81fa-da39b0121c51" containerName="extract" Mar 10 16:02:06 crc kubenswrapper[4749]: E0310 16:02:06.554257 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a63022-7abc-4ef6-81fa-da39b0121c51" containerName="pull" Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.554264 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a63022-7abc-4ef6-81fa-da39b0121c51" containerName="pull" Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.554408 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="460ee780-c5e3-437a-9a2a-3ed268e2173a" containerName="oc" Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.554423 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a63022-7abc-4ef6-81fa-da39b0121c51" containerName="extract" Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.555425 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-mht6p" Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.558426 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-cgvcq" Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.558592 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.558611 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.566708 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-mht6p"] Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.568850 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg66v\" (UniqueName: \"kubernetes.io/projected/fb72dd01-1d4a-4322-936b-60a188b23af8-kube-api-access-fg66v\") pod \"nmstate-operator-75c5dccd6c-mht6p\" (UID: \"fb72dd01-1d4a-4322-936b-60a188b23af8\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-mht6p" Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.670136 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fg66v\" (UniqueName: \"kubernetes.io/projected/fb72dd01-1d4a-4322-936b-60a188b23af8-kube-api-access-fg66v\") pod \"nmstate-operator-75c5dccd6c-mht6p\" (UID: \"fb72dd01-1d4a-4322-936b-60a188b23af8\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-mht6p" Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.689615 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg66v\" (UniqueName: \"kubernetes.io/projected/fb72dd01-1d4a-4322-936b-60a188b23af8-kube-api-access-fg66v\") pod \"nmstate-operator-75c5dccd6c-mht6p\" (UID: \"fb72dd01-1d4a-4322-936b-60a188b23af8\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-mht6p" Mar 10 16:02:06 crc kubenswrapper[4749]: I0310 16:02:06.877087 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-mht6p" Mar 10 16:02:07 crc kubenswrapper[4749]: I0310 16:02:07.114552 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-mht6p"] Mar 10 16:02:07 crc kubenswrapper[4749]: W0310 16:02:07.126337 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb72dd01_1d4a_4322_936b_60a188b23af8.slice/crio-cf2ee3ecdfd555b0282a459e660b96abf4f4407dfbfc5a75b2a22a9ceb1fe7d5 WatchSource:0}: Error finding container cf2ee3ecdfd555b0282a459e660b96abf4f4407dfbfc5a75b2a22a9ceb1fe7d5: Status 404 returned error can't find the container with id cf2ee3ecdfd555b0282a459e660b96abf4f4407dfbfc5a75b2a22a9ceb1fe7d5 Mar 10 16:02:07 crc kubenswrapper[4749]: I0310 16:02:07.442796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-mht6p" event={"ID":"fb72dd01-1d4a-4322-936b-60a188b23af8","Type":"ContainerStarted","Data":"cf2ee3ecdfd555b0282a459e660b96abf4f4407dfbfc5a75b2a22a9ceb1fe7d5"} Mar 10 16:02:07 crc kubenswrapper[4749]: I0310 16:02:07.616780 4749 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d648991-7780-40b1-844f-d735838969c7" path="/var/lib/kubelet/pods/3d648991-7780-40b1-844f-d735838969c7/volumes" Mar 10 16:02:10 crc kubenswrapper[4749]: I0310 16:02:10.464414 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-mht6p" event={"ID":"fb72dd01-1d4a-4322-936b-60a188b23af8","Type":"ContainerStarted","Data":"712758b947a9bd62c362e08fadbfd60fb90896045875c91d3a30f3d3a5d3fb95"} Mar 10 16:02:10 crc kubenswrapper[4749]: I0310 16:02:10.498542 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-mht6p" podStartSLOduration=2.077577662 podStartE2EDuration="4.49851589s" podCreationTimestamp="2026-03-10 16:02:06 +0000 UTC" firstStartedPulling="2026-03-10 16:02:07.128130292 +0000 UTC m=+824.249995979" lastFinishedPulling="2026-03-10 16:02:09.54906852 +0000 UTC m=+826.670934207" observedRunningTime="2026-03-10 16:02:10.496093385 +0000 UTC m=+827.617959092" watchObservedRunningTime="2026-03-10 16:02:10.49851589 +0000 UTC m=+827.620381587" Mar 10 16:02:12 crc kubenswrapper[4749]: E0310 16:02:12.972897 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f.scope\": RecentStats: unable to find data in memory cache]" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.425598 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-t8wjh"] Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.427506 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-t8wjh" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.433770 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2grb8" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.435758 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-t8wjh"] Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.440609 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf"] Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.441488 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.446819 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.451981 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf"] Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.461328 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-z4rkn"] Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.462018 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.586740 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98"] Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.587619 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.589924 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.590303 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-65568" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.590522 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.592461 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j2pm\" (UniqueName: \"kubernetes.io/projected/6951e69a-ab5b-48c2-9de9-b70d82ec527e-kube-api-access-8j2pm\") pod \"nmstate-webhook-786f45cff4-gn4cf\" (UID: \"6951e69a-ab5b-48c2-9de9-b70d82ec527e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.592505 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb-ovs-socket\") pod \"nmstate-handler-z4rkn\" (UID: \"4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb\") " pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.592558 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9xpm\" (UniqueName: \"kubernetes.io/projected/4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb-kube-api-access-t9xpm\") pod \"nmstate-handler-z4rkn\" (UID: \"4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb\") " pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.592611 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb-nmstate-lock\") pod \"nmstate-handler-z4rkn\" (UID: \"4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb\") " pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.592646 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6951e69a-ab5b-48c2-9de9-b70d82ec527e-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-gn4cf\" (UID: \"6951e69a-ab5b-48c2-9de9-b70d82ec527e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.592712 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb-dbus-socket\") pod \"nmstate-handler-z4rkn\" (UID: \"4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb\") " pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.592740 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5zrk\" (UniqueName: \"kubernetes.io/projected/19dc9ec1-eeaa-4d4b-a800-cc90a945eef5-kube-api-access-w5zrk\") pod \"nmstate-metrics-69594cc75-t8wjh\" (UID: \"19dc9ec1-eeaa-4d4b-a800-cc90a945eef5\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-t8wjh" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.596929 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98"] Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.693578 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6951e69a-ab5b-48c2-9de9-b70d82ec527e-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-gn4cf\" (UID: 
\"6951e69a-ab5b-48c2-9de9-b70d82ec527e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.693634 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-phg98\" (UID: \"a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.693691 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb-dbus-socket\") pod \"nmstate-handler-z4rkn\" (UID: \"4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb\") " pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.693758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5zrk\" (UniqueName: \"kubernetes.io/projected/19dc9ec1-eeaa-4d4b-a800-cc90a945eef5-kube-api-access-w5zrk\") pod \"nmstate-metrics-69594cc75-t8wjh\" (UID: \"19dc9ec1-eeaa-4d4b-a800-cc90a945eef5\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-t8wjh" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.693800 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j2pm\" (UniqueName: \"kubernetes.io/projected/6951e69a-ab5b-48c2-9de9-b70d82ec527e-kube-api-access-8j2pm\") pod \"nmstate-webhook-786f45cff4-gn4cf\" (UID: \"6951e69a-ab5b-48c2-9de9-b70d82ec527e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.693820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb-ovs-socket\") pod \"nmstate-handler-z4rkn\" (UID: \"4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb\") " pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.693839 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9xpm\" (UniqueName: \"kubernetes.io/projected/4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb-kube-api-access-t9xpm\") pod \"nmstate-handler-z4rkn\" (UID: \"4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb\") " pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.693873 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb-nmstate-lock\") pod \"nmstate-handler-z4rkn\" (UID: \"4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb\") " pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.693897 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cc8b\" (UniqueName: \"kubernetes.io/projected/a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4-kube-api-access-7cc8b\") pod \"nmstate-console-plugin-5dcbbd79cf-phg98\" (UID: \"a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.693921 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb-ovs-socket\") pod \"nmstate-handler-z4rkn\" (UID: \"4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb\") " pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.693923 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-phg98\" (UID: \"a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.693952 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb-dbus-socket\") pod \"nmstate-handler-z4rkn\" (UID: \"4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb\") " pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.693972 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb-nmstate-lock\") pod \"nmstate-handler-z4rkn\" (UID: \"4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb\") " pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.709495 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6951e69a-ab5b-48c2-9de9-b70d82ec527e-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-gn4cf\" (UID: \"6951e69a-ab5b-48c2-9de9-b70d82ec527e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.721922 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5zrk\" (UniqueName: \"kubernetes.io/projected/19dc9ec1-eeaa-4d4b-a800-cc90a945eef5-kube-api-access-w5zrk\") pod \"nmstate-metrics-69594cc75-t8wjh\" (UID: \"19dc9ec1-eeaa-4d4b-a800-cc90a945eef5\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-t8wjh" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.738218 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j2pm\" (UniqueName: 
\"kubernetes.io/projected/6951e69a-ab5b-48c2-9de9-b70d82ec527e-kube-api-access-8j2pm\") pod \"nmstate-webhook-786f45cff4-gn4cf\" (UID: \"6951e69a-ab5b-48c2-9de9-b70d82ec527e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.747871 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-t8wjh" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.748330 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9xpm\" (UniqueName: \"kubernetes.io/projected/4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb-kube-api-access-t9xpm\") pod \"nmstate-handler-z4rkn\" (UID: \"4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb\") " pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.773356 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.785621 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.798112 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cc8b\" (UniqueName: \"kubernetes.io/projected/a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4-kube-api-access-7cc8b\") pod \"nmstate-console-plugin-5dcbbd79cf-phg98\" (UID: \"a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.798187 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-phg98\" (UID: \"a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.798229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-phg98\" (UID: \"a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" Mar 10 16:02:22 crc kubenswrapper[4749]: E0310 16:02:22.798558 4749 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 10 16:02:22 crc kubenswrapper[4749]: E0310 16:02:22.798695 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4-plugin-serving-cert podName:a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4 nodeName:}" failed. No retries permitted until 2026-03-10 16:02:23.298673078 +0000 UTC m=+840.420538765 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4-plugin-serving-cert") pod "nmstate-console-plugin-5dcbbd79cf-phg98" (UID: "a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4") : secret "plugin-serving-cert" not found Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.799902 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-phg98\" (UID: \"a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.829535 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cc8b\" (UniqueName: \"kubernetes.io/projected/a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4-kube-api-access-7cc8b\") pod \"nmstate-console-plugin-5dcbbd79cf-phg98\" (UID: \"a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.897913 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5dd79b944c-wqxtv"] Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.898822 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:22 crc kubenswrapper[4749]: I0310 16:02:22.914454 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dd79b944c-wqxtv"] Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.000513 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-console-oauth-config\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.000933 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-oauth-serving-cert\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.000973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-console-serving-cert\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.000994 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-console-config\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.001013 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n52l\" (UniqueName: \"kubernetes.io/projected/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-kube-api-access-8n52l\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.001049 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-trusted-ca-bundle\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.001087 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-service-ca\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.060532 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf"] Mar 10 16:02:23 crc kubenswrapper[4749]: W0310 16:02:23.068597 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6951e69a_ab5b_48c2_9de9_b70d82ec527e.slice/crio-cb1475d03fe24f24dc85a6c2763a5f9310978ea8599da1797a84465b110914c5 WatchSource:0}: Error finding container cb1475d03fe24f24dc85a6c2763a5f9310978ea8599da1797a84465b110914c5: Status 404 returned error can't find the container with id cb1475d03fe24f24dc85a6c2763a5f9310978ea8599da1797a84465b110914c5 Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.102742 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-console-oauth-config\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.102783 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-oauth-serving-cert\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.102848 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-console-serving-cert\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.102890 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-console-config\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.102921 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n52l\" (UniqueName: \"kubernetes.io/projected/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-kube-api-access-8n52l\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.102957 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-trusted-ca-bundle\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.102999 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-service-ca\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.106998 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-service-ca\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.106781 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-console-config\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: E0310 16:02:23.106984 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-53462f1397163559b9ec6347cf02e731bcbf2f7398cd2f0dd8bdb42a5e19ac5c.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-58becd66e548cf16c8cf6ccf22052fd7279718e4ef0ec29eb37624cf51bdfc97.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-84d8ea083f495d9e3df019fc59371f25b7e764d8ef2f669b40a12327ceefa73f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac9a20c_b1f6_4bb2_a363_072abb3c04d2.slice/crio-conmon-18e3257604056bc9497566ff21d70997db2d6307e4cdc12d9f60e4031c776eb7.scope\": RecentStats: unable to find data in memory cache]" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.108407 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-oauth-serving-cert\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.108756 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-trusted-ca-bundle\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " 
pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.110832 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-console-oauth-config\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.112532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-console-serving-cert\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.117417 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-t8wjh"] Mar 10 16:02:23 crc kubenswrapper[4749]: W0310 16:02:23.125870 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19dc9ec1_eeaa_4d4b_a800_cc90a945eef5.slice/crio-9da4378b6a2e019bc07a5c01cb330b0552dc915961d73a2afbce7a2d7d62fe0f WatchSource:0}: Error finding container 9da4378b6a2e019bc07a5c01cb330b0552dc915961d73a2afbce7a2d7d62fe0f: Status 404 returned error can't find the container with id 9da4378b6a2e019bc07a5c01cb330b0552dc915961d73a2afbce7a2d7d62fe0f Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.128023 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n52l\" (UniqueName: \"kubernetes.io/projected/69da62cc-f6d4-4ea9-a6ce-5db6d2c80694-kube-api-access-8n52l\") pod \"console-5dd79b944c-wqxtv\" (UID: \"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694\") " pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 
16:02:23.214438 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.305751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-phg98\" (UID: \"a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.312554 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-phg98\" (UID: \"a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.506541 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.570370 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-z4rkn" event={"ID":"4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb","Type":"ContainerStarted","Data":"ba978e01fbf66e5d76ff2381429f3bd45a2503deaf3f279286ac31f4edd801d6"} Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.571642 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf" event={"ID":"6951e69a-ab5b-48c2-9de9-b70d82ec527e","Type":"ContainerStarted","Data":"cb1475d03fe24f24dc85a6c2763a5f9310978ea8599da1797a84465b110914c5"} Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.572624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-t8wjh" event={"ID":"19dc9ec1-eeaa-4d4b-a800-cc90a945eef5","Type":"ContainerStarted","Data":"9da4378b6a2e019bc07a5c01cb330b0552dc915961d73a2afbce7a2d7d62fe0f"} Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.636892 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dd79b944c-wqxtv"] Mar 10 16:02:23 crc kubenswrapper[4749]: I0310 16:02:23.741276 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98"] Mar 10 16:02:23 crc kubenswrapper[4749]: W0310 16:02:23.759312 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2ab74d0_8aa2_4e6b_9ef9_d9213935bdd4.slice/crio-f54d956e633c1374e7b41252c26a78b69a0dd0d93814581073bf6ba72ff19c91 WatchSource:0}: Error finding container f54d956e633c1374e7b41252c26a78b69a0dd0d93814581073bf6ba72ff19c91: Status 404 returned error can't find the container with id f54d956e633c1374e7b41252c26a78b69a0dd0d93814581073bf6ba72ff19c91 Mar 10 16:02:24 crc kubenswrapper[4749]: I0310 
16:02:24.581572 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" event={"ID":"a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4","Type":"ContainerStarted","Data":"f54d956e633c1374e7b41252c26a78b69a0dd0d93814581073bf6ba72ff19c91"} Mar 10 16:02:24 crc kubenswrapper[4749]: I0310 16:02:24.583528 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dd79b944c-wqxtv" event={"ID":"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694","Type":"ContainerStarted","Data":"818aab5af136bd14c8897e31425d7752f6d18eb2ebbfe83d68cf43308b24145d"} Mar 10 16:02:24 crc kubenswrapper[4749]: I0310 16:02:24.583572 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dd79b944c-wqxtv" event={"ID":"69da62cc-f6d4-4ea9-a6ce-5db6d2c80694","Type":"ContainerStarted","Data":"4cf09c0d48a7bb21e87e27f8c725ba3132d6e3a0fbcf30d98a0d56ff9b19996a"} Mar 10 16:02:24 crc kubenswrapper[4749]: I0310 16:02:24.605504 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5dd79b944c-wqxtv" podStartSLOduration=2.605472427 podStartE2EDuration="2.605472427s" podCreationTimestamp="2026-03-10 16:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:02:24.603992927 +0000 UTC m=+841.725858614" watchObservedRunningTime="2026-03-10 16:02:24.605472427 +0000 UTC m=+841.727338114" Mar 10 16:02:26 crc kubenswrapper[4749]: I0310 16:02:26.599799 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-z4rkn" event={"ID":"4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb","Type":"ContainerStarted","Data":"59deee80de9aec47697a375c1e476ee2c8b9e1b7441eec255ba4baaaeb28aa71"} Mar 10 16:02:26 crc kubenswrapper[4749]: I0310 16:02:26.600811 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 
16:02:26 crc kubenswrapper[4749]: I0310 16:02:26.603103 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf" event={"ID":"6951e69a-ab5b-48c2-9de9-b70d82ec527e","Type":"ContainerStarted","Data":"bcb4e105ca91df056cc1dfa2e6c61df38e4c40c073aaac1e89921f4561757ea9"} Mar 10 16:02:26 crc kubenswrapper[4749]: I0310 16:02:26.603296 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf" Mar 10 16:02:26 crc kubenswrapper[4749]: I0310 16:02:26.606750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-t8wjh" event={"ID":"19dc9ec1-eeaa-4d4b-a800-cc90a945eef5","Type":"ContainerStarted","Data":"ac8e17ba58691ce3cb0c182ce7ce8916bd9c20739a12302d79019fd85fb22af6"} Mar 10 16:02:26 crc kubenswrapper[4749]: I0310 16:02:26.619854 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-z4rkn" podStartSLOduration=1.848837769 podStartE2EDuration="4.619833679s" podCreationTimestamp="2026-03-10 16:02:22 +0000 UTC" firstStartedPulling="2026-03-10 16:02:22.852015846 +0000 UTC m=+839.973881533" lastFinishedPulling="2026-03-10 16:02:25.623011706 +0000 UTC m=+842.744877443" observedRunningTime="2026-03-10 16:02:26.61841802 +0000 UTC m=+843.740283707" watchObservedRunningTime="2026-03-10 16:02:26.619833679 +0000 UTC m=+843.741699366" Mar 10 16:02:26 crc kubenswrapper[4749]: I0310 16:02:26.652196 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf" podStartSLOduration=1.996560487 podStartE2EDuration="4.652170233s" podCreationTimestamp="2026-03-10 16:02:22 +0000 UTC" firstStartedPulling="2026-03-10 16:02:23.07790136 +0000 UTC m=+840.199767047" lastFinishedPulling="2026-03-10 16:02:25.733511086 +0000 UTC m=+842.855376793" observedRunningTime="2026-03-10 16:02:26.638142879 +0000 UTC 
m=+843.760008566" watchObservedRunningTime="2026-03-10 16:02:26.652170233 +0000 UTC m=+843.774035920" Mar 10 16:02:27 crc kubenswrapper[4749]: I0310 16:02:27.616364 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" event={"ID":"a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4","Type":"ContainerStarted","Data":"b919d2923374b7e894548bd21606ab6e9324ea899e6c5a2766086c3900e6e466"} Mar 10 16:02:27 crc kubenswrapper[4749]: I0310 16:02:27.633708 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-phg98" podStartSLOduration=2.738280998 podStartE2EDuration="5.633674718s" podCreationTimestamp="2026-03-10 16:02:22 +0000 UTC" firstStartedPulling="2026-03-10 16:02:23.762258733 +0000 UTC m=+840.884124420" lastFinishedPulling="2026-03-10 16:02:26.657652463 +0000 UTC m=+843.779518140" observedRunningTime="2026-03-10 16:02:27.629425391 +0000 UTC m=+844.751291078" watchObservedRunningTime="2026-03-10 16:02:27.633674718 +0000 UTC m=+844.755540405" Mar 10 16:02:28 crc kubenswrapper[4749]: I0310 16:02:28.626039 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-t8wjh" event={"ID":"19dc9ec1-eeaa-4d4b-a800-cc90a945eef5","Type":"ContainerStarted","Data":"712d0265b73bb22c0112b193e230f8b41014991f0f255b3df38ac901406b070e"} Mar 10 16:02:28 crc kubenswrapper[4749]: I0310 16:02:28.664295 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-t8wjh" podStartSLOduration=1.501758584 podStartE2EDuration="6.664267474s" podCreationTimestamp="2026-03-10 16:02:22 +0000 UTC" firstStartedPulling="2026-03-10 16:02:23.12806687 +0000 UTC m=+840.249932557" lastFinishedPulling="2026-03-10 16:02:28.29057576 +0000 UTC m=+845.412441447" observedRunningTime="2026-03-10 16:02:28.650835646 +0000 UTC m=+845.772701373" watchObservedRunningTime="2026-03-10 
16:02:28.664267474 +0000 UTC m=+845.786133161" Mar 10 16:02:32 crc kubenswrapper[4749]: I0310 16:02:32.810234 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-z4rkn" Mar 10 16:02:33 crc kubenswrapper[4749]: I0310 16:02:33.214818 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:33 crc kubenswrapper[4749]: I0310 16:02:33.214873 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:33 crc kubenswrapper[4749]: I0310 16:02:33.222868 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:33 crc kubenswrapper[4749]: I0310 16:02:33.665359 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5dd79b944c-wqxtv" Mar 10 16:02:33 crc kubenswrapper[4749]: I0310 16:02:33.728742 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-q8p7p"] Mar 10 16:02:42 crc kubenswrapper[4749]: I0310 16:02:42.781003 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-gn4cf" Mar 10 16:02:50 crc kubenswrapper[4749]: I0310 16:02:50.981026 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:02:50 crc kubenswrapper[4749]: I0310 16:02:50.982094 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:02:53 crc kubenswrapper[4749]: I0310 16:02:53.082221 4749 scope.go:117] "RemoveContainer" containerID="c60152907d563889bb402382f1124b478ec87751301278e202a6fc1400fc617f" Mar 10 16:02:55 crc kubenswrapper[4749]: I0310 16:02:55.875419 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb"] Mar 10 16:02:55 crc kubenswrapper[4749]: I0310 16:02:55.877427 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" Mar 10 16:02:55 crc kubenswrapper[4749]: I0310 16:02:55.891569 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb"] Mar 10 16:02:55 crc kubenswrapper[4749]: I0310 16:02:55.892678 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 16:02:55 crc kubenswrapper[4749]: I0310 16:02:55.901542 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7a05570-e2b0-4d18-bad1-485091a3fdc5-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb\" (UID: \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" Mar 10 16:02:55 crc kubenswrapper[4749]: I0310 16:02:55.901653 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k29v5\" (UniqueName: \"kubernetes.io/projected/b7a05570-e2b0-4d18-bad1-485091a3fdc5-kube-api-access-k29v5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb\" (UID: \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\") " 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" Mar 10 16:02:55 crc kubenswrapper[4749]: I0310 16:02:55.901749 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7a05570-e2b0-4d18-bad1-485091a3fdc5-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb\" (UID: \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" Mar 10 16:02:56 crc kubenswrapper[4749]: I0310 16:02:56.002638 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7a05570-e2b0-4d18-bad1-485091a3fdc5-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb\" (UID: \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" Mar 10 16:02:56 crc kubenswrapper[4749]: I0310 16:02:56.002715 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k29v5\" (UniqueName: \"kubernetes.io/projected/b7a05570-e2b0-4d18-bad1-485091a3fdc5-kube-api-access-k29v5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb\" (UID: \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" Mar 10 16:02:56 crc kubenswrapper[4749]: I0310 16:02:56.002775 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7a05570-e2b0-4d18-bad1-485091a3fdc5-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb\" (UID: \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" Mar 10 16:02:56 crc kubenswrapper[4749]: 
I0310 16:02:56.003263 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7a05570-e2b0-4d18-bad1-485091a3fdc5-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb\" (UID: \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" Mar 10 16:02:56 crc kubenswrapper[4749]: I0310 16:02:56.003293 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7a05570-e2b0-4d18-bad1-485091a3fdc5-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb\" (UID: \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" Mar 10 16:02:56 crc kubenswrapper[4749]: I0310 16:02:56.024445 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k29v5\" (UniqueName: \"kubernetes.io/projected/b7a05570-e2b0-4d18-bad1-485091a3fdc5-kube-api-access-k29v5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb\" (UID: \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" Mar 10 16:02:56 crc kubenswrapper[4749]: I0310 16:02:56.195992 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" Mar 10 16:02:56 crc kubenswrapper[4749]: I0310 16:02:56.443566 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb"] Mar 10 16:02:56 crc kubenswrapper[4749]: I0310 16:02:56.809329 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7a05570-e2b0-4d18-bad1-485091a3fdc5" containerID="a3f8c3dd94b717cdc7f90c7ad7cc7530226aee947916021f99e56d53876508ff" exitCode=0 Mar 10 16:02:56 crc kubenswrapper[4749]: I0310 16:02:56.809432 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" event={"ID":"b7a05570-e2b0-4d18-bad1-485091a3fdc5","Type":"ContainerDied","Data":"a3f8c3dd94b717cdc7f90c7ad7cc7530226aee947916021f99e56d53876508ff"} Mar 10 16:02:56 crc kubenswrapper[4749]: I0310 16:02:56.810079 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" event={"ID":"b7a05570-e2b0-4d18-bad1-485091a3fdc5","Type":"ContainerStarted","Data":"cbde0625704ae9a54de77eb7568e1f8c4e644a11fa759ae1c2283b4f0cc008bf"} Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.153876 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jjm7r"] Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.155729 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.177886 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jjm7r"] Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.335895 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b88b8d54-9b85-44b4-a316-d9422165c46a-catalog-content\") pod \"redhat-operators-jjm7r\" (UID: \"b88b8d54-9b85-44b4-a316-d9422165c46a\") " pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.335975 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88b8d54-9b85-44b4-a316-d9422165c46a-utilities\") pod \"redhat-operators-jjm7r\" (UID: \"b88b8d54-9b85-44b4-a316-d9422165c46a\") " pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.336007 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96c29\" (UniqueName: \"kubernetes.io/projected/b88b8d54-9b85-44b4-a316-d9422165c46a-kube-api-access-96c29\") pod \"redhat-operators-jjm7r\" (UID: \"b88b8d54-9b85-44b4-a316-d9422165c46a\") " pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.436618 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88b8d54-9b85-44b4-a316-d9422165c46a-utilities\") pod \"redhat-operators-jjm7r\" (UID: \"b88b8d54-9b85-44b4-a316-d9422165c46a\") " pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.436683 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-96c29\" (UniqueName: \"kubernetes.io/projected/b88b8d54-9b85-44b4-a316-d9422165c46a-kube-api-access-96c29\") pod \"redhat-operators-jjm7r\" (UID: \"b88b8d54-9b85-44b4-a316-d9422165c46a\") " pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.436766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b88b8d54-9b85-44b4-a316-d9422165c46a-catalog-content\") pod \"redhat-operators-jjm7r\" (UID: \"b88b8d54-9b85-44b4-a316-d9422165c46a\") " pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.437468 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88b8d54-9b85-44b4-a316-d9422165c46a-utilities\") pod \"redhat-operators-jjm7r\" (UID: \"b88b8d54-9b85-44b4-a316-d9422165c46a\") " pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.437502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b88b8d54-9b85-44b4-a316-d9422165c46a-catalog-content\") pod \"redhat-operators-jjm7r\" (UID: \"b88b8d54-9b85-44b4-a316-d9422165c46a\") " pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.465532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96c29\" (UniqueName: \"kubernetes.io/projected/b88b8d54-9b85-44b4-a316-d9422165c46a-kube-api-access-96c29\") pod \"redhat-operators-jjm7r\" (UID: \"b88b8d54-9b85-44b4-a316-d9422165c46a\") " pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.471649 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.662559 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jjm7r"] Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.779212 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-q8p7p" podUID="e9a7d78a-ab6f-456c-8433-5c1592d019c6" containerName="console" containerID="cri-o://7b67f59f1754136de0dd49df52fde9cccd8544313dc59bc46d78588b772cd7df" gracePeriod=15 Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.825307 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjm7r" event={"ID":"b88b8d54-9b85-44b4-a316-d9422165c46a","Type":"ContainerStarted","Data":"39f9f24ec093aeb934ebb096919ed8eb74d396df7b02f8891f92ad67a0707651"} Mar 10 16:02:58 crc kubenswrapper[4749]: I0310 16:02:58.825418 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjm7r" event={"ID":"b88b8d54-9b85-44b4-a316-d9422165c46a","Type":"ContainerStarted","Data":"6e451c52e61968a1c8542d1729f25a7ee048bdc62c94f87e02a003f95b1fc815"} Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.093491 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q8p7p_e9a7d78a-ab6f-456c-8433-5c1592d019c6/console/0.log" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.093557 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.248093 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-trusted-ca-bundle\") pod \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.248609 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-config\") pod \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.248645 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-oauth-serving-cert\") pod \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.248672 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-serving-cert\") pod \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.248705 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-service-ca\") pod \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.248735 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-oauth-config\") pod \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.248787 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzr4f\" (UniqueName: \"kubernetes.io/projected/e9a7d78a-ab6f-456c-8433-5c1592d019c6-kube-api-access-tzr4f\") pod \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\" (UID: \"e9a7d78a-ab6f-456c-8433-5c1592d019c6\") " Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.249107 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e9a7d78a-ab6f-456c-8433-5c1592d019c6" (UID: "e9a7d78a-ab6f-456c-8433-5c1592d019c6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.249440 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e9a7d78a-ab6f-456c-8433-5c1592d019c6" (UID: "e9a7d78a-ab6f-456c-8433-5c1592d019c6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.249728 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-service-ca" (OuterVolumeSpecName: "service-ca") pod "e9a7d78a-ab6f-456c-8433-5c1592d019c6" (UID: "e9a7d78a-ab6f-456c-8433-5c1592d019c6"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.250110 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-config" (OuterVolumeSpecName: "console-config") pod "e9a7d78a-ab6f-456c-8433-5c1592d019c6" (UID: "e9a7d78a-ab6f-456c-8433-5c1592d019c6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.255822 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e9a7d78a-ab6f-456c-8433-5c1592d019c6" (UID: "e9a7d78a-ab6f-456c-8433-5c1592d019c6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.256195 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e9a7d78a-ab6f-456c-8433-5c1592d019c6" (UID: "e9a7d78a-ab6f-456c-8433-5c1592d019c6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.257336 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9a7d78a-ab6f-456c-8433-5c1592d019c6-kube-api-access-tzr4f" (OuterVolumeSpecName: "kube-api-access-tzr4f") pod "e9a7d78a-ab6f-456c-8433-5c1592d019c6" (UID: "e9a7d78a-ab6f-456c-8433-5c1592d019c6"). InnerVolumeSpecName "kube-api-access-tzr4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.349899 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.349956 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.349967 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.349975 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.349984 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9a7d78a-ab6f-456c-8433-5c1592d019c6-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.349993 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e9a7d78a-ab6f-456c-8433-5c1592d019c6-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.350001 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzr4f\" (UniqueName: \"kubernetes.io/projected/e9a7d78a-ab6f-456c-8433-5c1592d019c6-kube-api-access-tzr4f\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:59 crc 
kubenswrapper[4749]: I0310 16:02:59.832169 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q8p7p_e9a7d78a-ab6f-456c-8433-5c1592d019c6/console/0.log" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.832497 4749 generic.go:334] "Generic (PLEG): container finished" podID="e9a7d78a-ab6f-456c-8433-5c1592d019c6" containerID="7b67f59f1754136de0dd49df52fde9cccd8544313dc59bc46d78588b772cd7df" exitCode=2 Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.832554 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q8p7p" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.832596 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q8p7p" event={"ID":"e9a7d78a-ab6f-456c-8433-5c1592d019c6","Type":"ContainerDied","Data":"7b67f59f1754136de0dd49df52fde9cccd8544313dc59bc46d78588b772cd7df"} Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.832843 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q8p7p" event={"ID":"e9a7d78a-ab6f-456c-8433-5c1592d019c6","Type":"ContainerDied","Data":"562487b6cbbd4f453d481158bde8038970721f1e66464ac7d379d80fa7026a4d"} Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.832872 4749 scope.go:117] "RemoveContainer" containerID="7b67f59f1754136de0dd49df52fde9cccd8544313dc59bc46d78588b772cd7df" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.834655 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7a05570-e2b0-4d18-bad1-485091a3fdc5" containerID="f845e4b93bd1be014ee574d942bf45f3fa0f5f57ef88f51f004d33ee7e2fdbd7" exitCode=0 Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.834683 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" 
event={"ID":"b7a05570-e2b0-4d18-bad1-485091a3fdc5","Type":"ContainerDied","Data":"f845e4b93bd1be014ee574d942bf45f3fa0f5f57ef88f51f004d33ee7e2fdbd7"} Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.839111 4749 generic.go:334] "Generic (PLEG): container finished" podID="b88b8d54-9b85-44b4-a316-d9422165c46a" containerID="39f9f24ec093aeb934ebb096919ed8eb74d396df7b02f8891f92ad67a0707651" exitCode=0 Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.839183 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjm7r" event={"ID":"b88b8d54-9b85-44b4-a316-d9422165c46a","Type":"ContainerDied","Data":"39f9f24ec093aeb934ebb096919ed8eb74d396df7b02f8891f92ad67a0707651"} Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.857061 4749 scope.go:117] "RemoveContainer" containerID="7b67f59f1754136de0dd49df52fde9cccd8544313dc59bc46d78588b772cd7df" Mar 10 16:02:59 crc kubenswrapper[4749]: E0310 16:02:59.858438 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b67f59f1754136de0dd49df52fde9cccd8544313dc59bc46d78588b772cd7df\": container with ID starting with 7b67f59f1754136de0dd49df52fde9cccd8544313dc59bc46d78588b772cd7df not found: ID does not exist" containerID="7b67f59f1754136de0dd49df52fde9cccd8544313dc59bc46d78588b772cd7df" Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.858549 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b67f59f1754136de0dd49df52fde9cccd8544313dc59bc46d78588b772cd7df"} err="failed to get container status \"7b67f59f1754136de0dd49df52fde9cccd8544313dc59bc46d78588b772cd7df\": rpc error: code = NotFound desc = could not find container \"7b67f59f1754136de0dd49df52fde9cccd8544313dc59bc46d78588b772cd7df\": container with ID starting with 7b67f59f1754136de0dd49df52fde9cccd8544313dc59bc46d78588b772cd7df not found: ID does not exist" Mar 10 16:02:59 crc 
kubenswrapper[4749]: I0310 16:02:59.904849 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-q8p7p"] Mar 10 16:02:59 crc kubenswrapper[4749]: I0310 16:02:59.908525 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-q8p7p"] Mar 10 16:03:00 crc kubenswrapper[4749]: I0310 16:03:00.848335 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7a05570-e2b0-4d18-bad1-485091a3fdc5" containerID="715b05ed2be9ef189c69bf269e5b0cddfcbbb3557696bf20afa3ce7aa6e09fab" exitCode=0 Mar 10 16:03:00 crc kubenswrapper[4749]: I0310 16:03:00.848759 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" event={"ID":"b7a05570-e2b0-4d18-bad1-485091a3fdc5","Type":"ContainerDied","Data":"715b05ed2be9ef189c69bf269e5b0cddfcbbb3557696bf20afa3ce7aa6e09fab"} Mar 10 16:03:01 crc kubenswrapper[4749]: I0310 16:03:01.621018 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9a7d78a-ab6f-456c-8433-5c1592d019c6" path="/var/lib/kubelet/pods/e9a7d78a-ab6f-456c-8433-5c1592d019c6/volumes" Mar 10 16:03:01 crc kubenswrapper[4749]: I0310 16:03:01.860135 4749 generic.go:334] "Generic (PLEG): container finished" podID="b88b8d54-9b85-44b4-a316-d9422165c46a" containerID="3b802b684395308b62474fbecd27c88734ba07f2b981775f7295a7977e6c524e" exitCode=0 Mar 10 16:03:01 crc kubenswrapper[4749]: I0310 16:03:01.860191 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjm7r" event={"ID":"b88b8d54-9b85-44b4-a316-d9422165c46a","Type":"ContainerDied","Data":"3b802b684395308b62474fbecd27c88734ba07f2b981775f7295a7977e6c524e"} Mar 10 16:03:01 crc kubenswrapper[4749]: I0310 16:03:01.862918 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.121602 4749 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.285886 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7a05570-e2b0-4d18-bad1-485091a3fdc5-util\") pod \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\" (UID: \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\") " Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.286037 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7a05570-e2b0-4d18-bad1-485091a3fdc5-bundle\") pod \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\" (UID: \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\") " Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.286165 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k29v5\" (UniqueName: \"kubernetes.io/projected/b7a05570-e2b0-4d18-bad1-485091a3fdc5-kube-api-access-k29v5\") pod \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\" (UID: \"b7a05570-e2b0-4d18-bad1-485091a3fdc5\") " Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.287961 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a05570-e2b0-4d18-bad1-485091a3fdc5-bundle" (OuterVolumeSpecName: "bundle") pod "b7a05570-e2b0-4d18-bad1-485091a3fdc5" (UID: "b7a05570-e2b0-4d18-bad1-485091a3fdc5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.297066 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a05570-e2b0-4d18-bad1-485091a3fdc5-kube-api-access-k29v5" (OuterVolumeSpecName: "kube-api-access-k29v5") pod "b7a05570-e2b0-4d18-bad1-485091a3fdc5" (UID: "b7a05570-e2b0-4d18-bad1-485091a3fdc5"). 
InnerVolumeSpecName "kube-api-access-k29v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.311909 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a05570-e2b0-4d18-bad1-485091a3fdc5-util" (OuterVolumeSpecName: "util") pod "b7a05570-e2b0-4d18-bad1-485091a3fdc5" (UID: "b7a05570-e2b0-4d18-bad1-485091a3fdc5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.388360 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k29v5\" (UniqueName: \"kubernetes.io/projected/b7a05570-e2b0-4d18-bad1-485091a3fdc5-kube-api-access-k29v5\") on node \"crc\" DevicePath \"\"" Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.388453 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7a05570-e2b0-4d18-bad1-485091a3fdc5-util\") on node \"crc\" DevicePath \"\"" Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.388473 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7a05570-e2b0-4d18-bad1-485091a3fdc5-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.868464 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" event={"ID":"b7a05570-e2b0-4d18-bad1-485091a3fdc5","Type":"ContainerDied","Data":"cbde0625704ae9a54de77eb7568e1f8c4e644a11fa759ae1c2283b4f0cc008bf"} Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.868506 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbde0625704ae9a54de77eb7568e1f8c4e644a11fa759ae1c2283b4f0cc008bf" Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.868477 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb" Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.871078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjm7r" event={"ID":"b88b8d54-9b85-44b4-a316-d9422165c46a","Type":"ContainerStarted","Data":"88a910b28a69fc863e6e19429330a3343046cd70676ab0fd4a8bef2f1a816fe2"} Mar 10 16:03:02 crc kubenswrapper[4749]: I0310 16:03:02.894090 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jjm7r" podStartSLOduration=2.3849838979999998 podStartE2EDuration="4.894074391s" podCreationTimestamp="2026-03-10 16:02:58 +0000 UTC" firstStartedPulling="2026-03-10 16:02:59.840158637 +0000 UTC m=+876.962024324" lastFinishedPulling="2026-03-10 16:03:02.34924909 +0000 UTC m=+879.471114817" observedRunningTime="2026-03-10 16:03:02.892918129 +0000 UTC m=+880.014783826" watchObservedRunningTime="2026-03-10 16:03:02.894074391 +0000 UTC m=+880.015940088" Mar 10 16:03:08 crc kubenswrapper[4749]: I0310 16:03:08.472494 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:03:08 crc kubenswrapper[4749]: I0310 16:03:08.472947 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:03:10 crc kubenswrapper[4749]: I0310 16:03:10.692460 4749 patch_prober.go:28] interesting pod/console-operator-58897d9998-5fc2g container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 16:03:10 crc kubenswrapper[4749]: I0310 16:03:10.692876 4749 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console-operator/console-operator-58897d9998-5fc2g" podUID="739717e7-ef4a-4032-82be-88a95648f3fe" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 16:03:11 crc kubenswrapper[4749]: I0310 16:03:11.053578 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jjm7r" podUID="b88b8d54-9b85-44b4-a316-d9422165c46a" containerName="registry-server" probeResult="failure" output=< Mar 10 16:03:11 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 10 16:03:11 crc kubenswrapper[4749]: > Mar 10 16:03:11 crc kubenswrapper[4749]: E0310 16:03:11.060324 4749 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.454s" Mar 10 16:03:11 crc kubenswrapper[4749]: I0310 16:03:11.063512 4749 patch_prober.go:28] interesting pod/console-operator-58897d9998-5fc2g container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 16:03:11 crc kubenswrapper[4749]: I0310 16:03:11.063576 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-5fc2g" podUID="739717e7-ef4a-4032-82be-88a95648f3fe" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.332344 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v"] Mar 10 16:03:13 crc kubenswrapper[4749]: E0310 16:03:13.332631 4749 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b7a05570-e2b0-4d18-bad1-485091a3fdc5" containerName="pull" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.332646 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a05570-e2b0-4d18-bad1-485091a3fdc5" containerName="pull" Mar 10 16:03:13 crc kubenswrapper[4749]: E0310 16:03:13.332662 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a05570-e2b0-4d18-bad1-485091a3fdc5" containerName="util" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.332669 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a05570-e2b0-4d18-bad1-485091a3fdc5" containerName="util" Mar 10 16:03:13 crc kubenswrapper[4749]: E0310 16:03:13.332681 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a05570-e2b0-4d18-bad1-485091a3fdc5" containerName="extract" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.332689 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a05570-e2b0-4d18-bad1-485091a3fdc5" containerName="extract" Mar 10 16:03:13 crc kubenswrapper[4749]: E0310 16:03:13.332709 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9a7d78a-ab6f-456c-8433-5c1592d019c6" containerName="console" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.332716 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9a7d78a-ab6f-456c-8433-5c1592d019c6" containerName="console" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.332829 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9a7d78a-ab6f-456c-8433-5c1592d019c6" containerName="console" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.332848 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a05570-e2b0-4d18-bad1-485091a3fdc5" containerName="extract" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.333294 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.335235 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.335544 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.335561 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.335784 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.335798 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-x84bs" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.352934 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v"] Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.479737 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2-webhook-cert\") pod \"metallb-operator-controller-manager-59548cf5bb-pnl5v\" (UID: \"2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2\") " pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.479853 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qd2d\" (UniqueName: \"kubernetes.io/projected/2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2-kube-api-access-6qd2d\") pod 
\"metallb-operator-controller-manager-59548cf5bb-pnl5v\" (UID: \"2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2\") " pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.480512 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2-apiservice-cert\") pod \"metallb-operator-controller-manager-59548cf5bb-pnl5v\" (UID: \"2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2\") " pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.582816 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2-apiservice-cert\") pod \"metallb-operator-controller-manager-59548cf5bb-pnl5v\" (UID: \"2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2\") " pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.582913 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2-webhook-cert\") pod \"metallb-operator-controller-manager-59548cf5bb-pnl5v\" (UID: \"2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2\") " pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.582960 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qd2d\" (UniqueName: \"kubernetes.io/projected/2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2-kube-api-access-6qd2d\") pod \"metallb-operator-controller-manager-59548cf5bb-pnl5v\" (UID: \"2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2\") " pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" Mar 10 16:03:13 crc 
kubenswrapper[4749]: I0310 16:03:13.593314 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2-apiservice-cert\") pod \"metallb-operator-controller-manager-59548cf5bb-pnl5v\" (UID: \"2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2\") " pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.593342 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2-webhook-cert\") pod \"metallb-operator-controller-manager-59548cf5bb-pnl5v\" (UID: \"2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2\") " pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.609215 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qd2d\" (UniqueName: \"kubernetes.io/projected/2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2-kube-api-access-6qd2d\") pod \"metallb-operator-controller-manager-59548cf5bb-pnl5v\" (UID: \"2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2\") " pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.649969 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.662768 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx"] Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.663425 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.665964 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.692043 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.692521 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-4w76f" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.692960 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwbzg\" (UniqueName: \"kubernetes.io/projected/3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95-kube-api-access-zwbzg\") pod \"metallb-operator-webhook-server-64c9df7c-wvtbx\" (UID: \"3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95\") " pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.693059 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95-apiservice-cert\") pod \"metallb-operator-webhook-server-64c9df7c-wvtbx\" (UID: \"3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95\") " pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.693093 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95-webhook-cert\") pod \"metallb-operator-webhook-server-64c9df7c-wvtbx\" (UID: \"3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95\") " pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" Mar 10 
16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.694017 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx"] Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.795939 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95-webhook-cert\") pod \"metallb-operator-webhook-server-64c9df7c-wvtbx\" (UID: \"3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95\") " pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.796410 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwbzg\" (UniqueName: \"kubernetes.io/projected/3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95-kube-api-access-zwbzg\") pod \"metallb-operator-webhook-server-64c9df7c-wvtbx\" (UID: \"3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95\") " pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.796466 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95-apiservice-cert\") pod \"metallb-operator-webhook-server-64c9df7c-wvtbx\" (UID: \"3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95\") " pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.806587 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95-webhook-cert\") pod \"metallb-operator-webhook-server-64c9df7c-wvtbx\" (UID: \"3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95\") " pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.850358 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95-apiservice-cert\") pod \"metallb-operator-webhook-server-64c9df7c-wvtbx\" (UID: \"3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95\") " pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" Mar 10 16:03:13 crc kubenswrapper[4749]: I0310 16:03:13.857700 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwbzg\" (UniqueName: \"kubernetes.io/projected/3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95-kube-api-access-zwbzg\") pod \"metallb-operator-webhook-server-64c9df7c-wvtbx\" (UID: \"3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95\") " pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" Mar 10 16:03:14 crc kubenswrapper[4749]: I0310 16:03:14.050280 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" Mar 10 16:03:14 crc kubenswrapper[4749]: I0310 16:03:14.222861 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v"] Mar 10 16:03:14 crc kubenswrapper[4749]: W0310 16:03:14.242556 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ab91d90_6c4b_46cb_9f09_eb6a2e1e6ad2.slice/crio-f3bf76d52458ae4ed801938f27a6264ced7548033b21730dbfa904a499657414 WatchSource:0}: Error finding container f3bf76d52458ae4ed801938f27a6264ced7548033b21730dbfa904a499657414: Status 404 returned error can't find the container with id f3bf76d52458ae4ed801938f27a6264ced7548033b21730dbfa904a499657414 Mar 10 16:03:14 crc kubenswrapper[4749]: I0310 16:03:14.327713 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx"] Mar 10 16:03:14 crc kubenswrapper[4749]: W0310 16:03:14.338556 4749 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3538a56a_7b3d_4fd3_9d65_f2b9ecc9de95.slice/crio-25ef4d59a60fb4618e6f35445b670c05576fec6d1b7e4700e8a63b4c501be466 WatchSource:0}: Error finding container 25ef4d59a60fb4618e6f35445b670c05576fec6d1b7e4700e8a63b4c501be466: Status 404 returned error can't find the container with id 25ef4d59a60fb4618e6f35445b670c05576fec6d1b7e4700e8a63b4c501be466 Mar 10 16:03:15 crc kubenswrapper[4749]: I0310 16:03:15.091324 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" event={"ID":"3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95","Type":"ContainerStarted","Data":"25ef4d59a60fb4618e6f35445b670c05576fec6d1b7e4700e8a63b4c501be466"} Mar 10 16:03:15 crc kubenswrapper[4749]: I0310 16:03:15.092708 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" event={"ID":"2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2","Type":"ContainerStarted","Data":"f3bf76d52458ae4ed801938f27a6264ced7548033b21730dbfa904a499657414"} Mar 10 16:03:18 crc kubenswrapper[4749]: I0310 16:03:18.520258 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:03:18 crc kubenswrapper[4749]: I0310 16:03:18.575935 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:03:20 crc kubenswrapper[4749]: I0310 16:03:20.150071 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jjm7r"] Mar 10 16:03:20 crc kubenswrapper[4749]: I0310 16:03:20.150756 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jjm7r" podUID="b88b8d54-9b85-44b4-a316-d9422165c46a" containerName="registry-server" 
containerID="cri-o://88a910b28a69fc863e6e19429330a3343046cd70676ab0fd4a8bef2f1a816fe2" gracePeriod=2 Mar 10 16:03:20 crc kubenswrapper[4749]: I0310 16:03:20.517017 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:03:20 crc kubenswrapper[4749]: I0310 16:03:20.719893 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96c29\" (UniqueName: \"kubernetes.io/projected/b88b8d54-9b85-44b4-a316-d9422165c46a-kube-api-access-96c29\") pod \"b88b8d54-9b85-44b4-a316-d9422165c46a\" (UID: \"b88b8d54-9b85-44b4-a316-d9422165c46a\") " Mar 10 16:03:20 crc kubenswrapper[4749]: I0310 16:03:20.719957 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b88b8d54-9b85-44b4-a316-d9422165c46a-catalog-content\") pod \"b88b8d54-9b85-44b4-a316-d9422165c46a\" (UID: \"b88b8d54-9b85-44b4-a316-d9422165c46a\") " Mar 10 16:03:20 crc kubenswrapper[4749]: I0310 16:03:20.720013 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88b8d54-9b85-44b4-a316-d9422165c46a-utilities\") pod \"b88b8d54-9b85-44b4-a316-d9422165c46a\" (UID: \"b88b8d54-9b85-44b4-a316-d9422165c46a\") " Mar 10 16:03:20 crc kubenswrapper[4749]: I0310 16:03:20.720957 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b88b8d54-9b85-44b4-a316-d9422165c46a-utilities" (OuterVolumeSpecName: "utilities") pod "b88b8d54-9b85-44b4-a316-d9422165c46a" (UID: "b88b8d54-9b85-44b4-a316-d9422165c46a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:03:20 crc kubenswrapper[4749]: I0310 16:03:20.726719 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b88b8d54-9b85-44b4-a316-d9422165c46a-kube-api-access-96c29" (OuterVolumeSpecName: "kube-api-access-96c29") pod "b88b8d54-9b85-44b4-a316-d9422165c46a" (UID: "b88b8d54-9b85-44b4-a316-d9422165c46a"). InnerVolumeSpecName "kube-api-access-96c29". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:03:20 crc kubenswrapper[4749]: I0310 16:03:20.821568 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96c29\" (UniqueName: \"kubernetes.io/projected/b88b8d54-9b85-44b4-a316-d9422165c46a-kube-api-access-96c29\") on node \"crc\" DevicePath \"\"" Mar 10 16:03:20 crc kubenswrapper[4749]: I0310 16:03:20.821607 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b88b8d54-9b85-44b4-a316-d9422165c46a-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:03:20 crc kubenswrapper[4749]: I0310 16:03:20.850712 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b88b8d54-9b85-44b4-a316-d9422165c46a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b88b8d54-9b85-44b4-a316-d9422165c46a" (UID: "b88b8d54-9b85-44b4-a316-d9422165c46a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:03:20 crc kubenswrapper[4749]: I0310 16:03:20.923730 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b88b8d54-9b85-44b4-a316-d9422165c46a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:03:20 crc kubenswrapper[4749]: I0310 16:03:20.981511 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:03:20 crc kubenswrapper[4749]: I0310 16:03:20.981611 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.134800 4749 generic.go:334] "Generic (PLEG): container finished" podID="b88b8d54-9b85-44b4-a316-d9422165c46a" containerID="88a910b28a69fc863e6e19429330a3343046cd70676ab0fd4a8bef2f1a816fe2" exitCode=0 Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.134849 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjm7r" event={"ID":"b88b8d54-9b85-44b4-a316-d9422165c46a","Type":"ContainerDied","Data":"88a910b28a69fc863e6e19429330a3343046cd70676ab0fd4a8bef2f1a816fe2"} Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.134934 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jjm7r" Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.135366 4749 scope.go:117] "RemoveContainer" containerID="88a910b28a69fc863e6e19429330a3343046cd70676ab0fd4a8bef2f1a816fe2" Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.135270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjm7r" event={"ID":"b88b8d54-9b85-44b4-a316-d9422165c46a","Type":"ContainerDied","Data":"6e451c52e61968a1c8542d1729f25a7ee048bdc62c94f87e02a003f95b1fc815"} Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.139233 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" event={"ID":"3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95","Type":"ContainerStarted","Data":"bfe0356e974897e42c234b8f70c1f13b8d04547411d3f660f2198f7d75362132"} Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.139307 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.146900 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" event={"ID":"2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2","Type":"ContainerStarted","Data":"dfeda472b222c5134e43362683d0bdeecfb6bbd327425b0d898cd655cae9e311"} Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.147124 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.160926 4749 scope.go:117] "RemoveContainer" containerID="3b802b684395308b62474fbecd27c88734ba07f2b981775f7295a7977e6c524e" Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.176071 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" podStartSLOduration=2.354767031 podStartE2EDuration="8.176023205s" podCreationTimestamp="2026-03-10 16:03:13 +0000 UTC" firstStartedPulling="2026-03-10 16:03:14.340843611 +0000 UTC m=+891.462709298" lastFinishedPulling="2026-03-10 16:03:20.162099785 +0000 UTC m=+897.283965472" observedRunningTime="2026-03-10 16:03:21.168929451 +0000 UTC m=+898.290795138" watchObservedRunningTime="2026-03-10 16:03:21.176023205 +0000 UTC m=+898.297888892" Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.186919 4749 scope.go:117] "RemoveContainer" containerID="39f9f24ec093aeb934ebb096919ed8eb74d396df7b02f8891f92ad67a0707651" Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.192502 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jjm7r"] Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.196568 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jjm7r"] Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.214953 4749 scope.go:117] "RemoveContainer" containerID="88a910b28a69fc863e6e19429330a3343046cd70676ab0fd4a8bef2f1a816fe2" Mar 10 16:03:21 crc kubenswrapper[4749]: E0310 16:03:21.215582 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88a910b28a69fc863e6e19429330a3343046cd70676ab0fd4a8bef2f1a816fe2\": container with ID starting with 88a910b28a69fc863e6e19429330a3343046cd70676ab0fd4a8bef2f1a816fe2 not found: ID does not exist" containerID="88a910b28a69fc863e6e19429330a3343046cd70676ab0fd4a8bef2f1a816fe2" Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.215627 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a910b28a69fc863e6e19429330a3343046cd70676ab0fd4a8bef2f1a816fe2"} err="failed to get container status 
\"88a910b28a69fc863e6e19429330a3343046cd70676ab0fd4a8bef2f1a816fe2\": rpc error: code = NotFound desc = could not find container \"88a910b28a69fc863e6e19429330a3343046cd70676ab0fd4a8bef2f1a816fe2\": container with ID starting with 88a910b28a69fc863e6e19429330a3343046cd70676ab0fd4a8bef2f1a816fe2 not found: ID does not exist" Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.215681 4749 scope.go:117] "RemoveContainer" containerID="3b802b684395308b62474fbecd27c88734ba07f2b981775f7295a7977e6c524e" Mar 10 16:03:21 crc kubenswrapper[4749]: E0310 16:03:21.216122 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b802b684395308b62474fbecd27c88734ba07f2b981775f7295a7977e6c524e\": container with ID starting with 3b802b684395308b62474fbecd27c88734ba07f2b981775f7295a7977e6c524e not found: ID does not exist" containerID="3b802b684395308b62474fbecd27c88734ba07f2b981775f7295a7977e6c524e" Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.216163 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b802b684395308b62474fbecd27c88734ba07f2b981775f7295a7977e6c524e"} err="failed to get container status \"3b802b684395308b62474fbecd27c88734ba07f2b981775f7295a7977e6c524e\": rpc error: code = NotFound desc = could not find container \"3b802b684395308b62474fbecd27c88734ba07f2b981775f7295a7977e6c524e\": container with ID starting with 3b802b684395308b62474fbecd27c88734ba07f2b981775f7295a7977e6c524e not found: ID does not exist" Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.216184 4749 scope.go:117] "RemoveContainer" containerID="39f9f24ec093aeb934ebb096919ed8eb74d396df7b02f8891f92ad67a0707651" Mar 10 16:03:21 crc kubenswrapper[4749]: E0310 16:03:21.216480 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"39f9f24ec093aeb934ebb096919ed8eb74d396df7b02f8891f92ad67a0707651\": container with ID starting with 39f9f24ec093aeb934ebb096919ed8eb74d396df7b02f8891f92ad67a0707651 not found: ID does not exist" containerID="39f9f24ec093aeb934ebb096919ed8eb74d396df7b02f8891f92ad67a0707651" Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.216500 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39f9f24ec093aeb934ebb096919ed8eb74d396df7b02f8891f92ad67a0707651"} err="failed to get container status \"39f9f24ec093aeb934ebb096919ed8eb74d396df7b02f8891f92ad67a0707651\": rpc error: code = NotFound desc = could not find container \"39f9f24ec093aeb934ebb096919ed8eb74d396df7b02f8891f92ad67a0707651\": container with ID starting with 39f9f24ec093aeb934ebb096919ed8eb74d396df7b02f8891f92ad67a0707651 not found: ID does not exist" Mar 10 16:03:21 crc kubenswrapper[4749]: I0310 16:03:21.618556 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b88b8d54-9b85-44b4-a316-d9422165c46a" path="/var/lib/kubelet/pods/b88b8d54-9b85-44b4-a316-d9422165c46a/volumes" Mar 10 16:03:34 crc kubenswrapper[4749]: I0310 16:03:34.055493 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-64c9df7c-wvtbx" Mar 10 16:03:34 crc kubenswrapper[4749]: I0310 16:03:34.076564 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" podStartSLOduration=15.175498689 podStartE2EDuration="21.076533314s" podCreationTimestamp="2026-03-10 16:03:13 +0000 UTC" firstStartedPulling="2026-03-10 16:03:14.245019971 +0000 UTC m=+891.366885658" lastFinishedPulling="2026-03-10 16:03:20.146054596 +0000 UTC m=+897.267920283" observedRunningTime="2026-03-10 16:03:21.209601093 +0000 UTC m=+898.331466770" watchObservedRunningTime="2026-03-10 16:03:34.076533314 +0000 UTC m=+911.198399001" Mar 10 16:03:50 crc 
kubenswrapper[4749]: I0310 16:03:50.981727 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:03:50 crc kubenswrapper[4749]: I0310 16:03:50.982793 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:03:50 crc kubenswrapper[4749]: I0310 16:03:50.982868 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 16:03:50 crc kubenswrapper[4749]: I0310 16:03:50.983687 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d4ea85a6b744107fd1b757efd6ea6aed1ac10e45ac86a77df1413fc6180c0184"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:03:50 crc kubenswrapper[4749]: I0310 16:03:50.983783 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://d4ea85a6b744107fd1b757efd6ea6aed1ac10e45ac86a77df1413fc6180c0184" gracePeriod=600 Mar 10 16:03:51 crc kubenswrapper[4749]: I0310 16:03:51.350232 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" 
containerID="d4ea85a6b744107fd1b757efd6ea6aed1ac10e45ac86a77df1413fc6180c0184" exitCode=0 Mar 10 16:03:51 crc kubenswrapper[4749]: I0310 16:03:51.350345 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"d4ea85a6b744107fd1b757efd6ea6aed1ac10e45ac86a77df1413fc6180c0184"} Mar 10 16:03:51 crc kubenswrapper[4749]: I0310 16:03:51.350852 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"8df1beddcbbe4b28bedf74a692eb90fcfbb0b66981e27d81e53ac5b8485c3d4f"} Mar 10 16:03:51 crc kubenswrapper[4749]: I0310 16:03:51.350884 4749 scope.go:117] "RemoveContainer" containerID="7b78ae72f8895fc1df287649b2d990626337b8a539e3e03d294824b60e7e24d6" Mar 10 16:03:53 crc kubenswrapper[4749]: I0310 16:03:53.652882 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-59548cf5bb-pnl5v" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.369831 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-wxp7g"] Mar 10 16:03:54 crc kubenswrapper[4749]: E0310 16:03:54.370190 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88b8d54-9b85-44b4-a316-d9422165c46a" containerName="extract-content" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.370214 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88b8d54-9b85-44b4-a316-d9422165c46a" containerName="extract-content" Mar 10 16:03:54 crc kubenswrapper[4749]: E0310 16:03:54.370235 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88b8d54-9b85-44b4-a316-d9422165c46a" containerName="registry-server" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.370242 4749 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b88b8d54-9b85-44b4-a316-d9422165c46a" containerName="registry-server" Mar 10 16:03:54 crc kubenswrapper[4749]: E0310 16:03:54.370256 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88b8d54-9b85-44b4-a316-d9422165c46a" containerName="extract-utilities" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.370264 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88b8d54-9b85-44b4-a316-d9422165c46a" containerName="extract-utilities" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.370498 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88b8d54-9b85-44b4-a316-d9422165c46a" containerName="registry-server" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.372786 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.374994 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.375257 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.375564 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-llrb6" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.392984 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv"] Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.394227 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.404266 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.404491 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv"] Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.496815 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-cmdn5"] Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.502307 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cmdn5" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.508245 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-metrics-certs\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.508299 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-memberlist\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.508348 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/45b99996-1ced-47bc-a309-4195e4880944-frr-startup\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.508394 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/45b99996-1ced-47bc-a309-4195e4880944-metrics\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.508423 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/45b99996-1ced-47bc-a309-4195e4880944-frr-sockets\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.508449 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts8cj\" (UniqueName: \"kubernetes.io/projected/ee72aaa3-2e8c-41d1-ae7e-c446e531300a-kube-api-access-ts8cj\") pod \"frr-k8s-webhook-server-7f989f654f-s98fv\" (UID: \"ee72aaa3-2e8c-41d1-ae7e-c446e531300a\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.508476 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/45b99996-1ced-47bc-a309-4195e4880944-reloader\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.508503 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q22w\" (UniqueName: \"kubernetes.io/projected/150f068d-c570-4301-8ea6-aceb34c0f84b-kube-api-access-8q22w\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.508527 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/150f068d-c570-4301-8ea6-aceb34c0f84b-metallb-excludel2\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.508545 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/45b99996-1ced-47bc-a309-4195e4880944-frr-conf\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.508565 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee72aaa3-2e8c-41d1-ae7e-c446e531300a-cert\") pod \"frr-k8s-webhook-server-7f989f654f-s98fv\" (UID: \"ee72aaa3-2e8c-41d1-ae7e-c446e531300a\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.508588 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfxjj\" (UniqueName: \"kubernetes.io/projected/45b99996-1ced-47bc-a309-4195e4880944-kube-api-access-vfxjj\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.508606 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45b99996-1ced-47bc-a309-4195e4880944-metrics-certs\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.512423 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/controller-86ddb6bd46-n8mpk"] Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.513359 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-n8mpk" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.513893 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.514046 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.514191 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.517687 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.518085 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-p76ps" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.600991 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-n8mpk"] Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609243 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30390e60-a7e2-4abc-b7d6-2384bd758fdd-cert\") pod \"controller-86ddb6bd46-n8mpk\" (UID: \"30390e60-a7e2-4abc-b7d6-2384bd758fdd\") " pod="metallb-system/controller-86ddb6bd46-n8mpk" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609284 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/45b99996-1ced-47bc-a309-4195e4880944-frr-startup\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " 
pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609302 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/45b99996-1ced-47bc-a309-4195e4880944-metrics\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609334 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/45b99996-1ced-47bc-a309-4195e4880944-frr-sockets\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609353 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts8cj\" (UniqueName: \"kubernetes.io/projected/ee72aaa3-2e8c-41d1-ae7e-c446e531300a-kube-api-access-ts8cj\") pod \"frr-k8s-webhook-server-7f989f654f-s98fv\" (UID: \"ee72aaa3-2e8c-41d1-ae7e-c446e531300a\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609398 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/45b99996-1ced-47bc-a309-4195e4880944-reloader\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vzkg\" (UniqueName: \"kubernetes.io/projected/30390e60-a7e2-4abc-b7d6-2384bd758fdd-kube-api-access-5vzkg\") pod \"controller-86ddb6bd46-n8mpk\" (UID: \"30390e60-a7e2-4abc-b7d6-2384bd758fdd\") " pod="metallb-system/controller-86ddb6bd46-n8mpk" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 
16:03:54.609441 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q22w\" (UniqueName: \"kubernetes.io/projected/150f068d-c570-4301-8ea6-aceb34c0f84b-kube-api-access-8q22w\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609476 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/150f068d-c570-4301-8ea6-aceb34c0f84b-metallb-excludel2\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609493 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/45b99996-1ced-47bc-a309-4195e4880944-frr-conf\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee72aaa3-2e8c-41d1-ae7e-c446e531300a-cert\") pod \"frr-k8s-webhook-server-7f989f654f-s98fv\" (UID: \"ee72aaa3-2e8c-41d1-ae7e-c446e531300a\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609526 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfxjj\" (UniqueName: \"kubernetes.io/projected/45b99996-1ced-47bc-a309-4195e4880944-kube-api-access-vfxjj\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609558 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30390e60-a7e2-4abc-b7d6-2384bd758fdd-metrics-certs\") pod \"controller-86ddb6bd46-n8mpk\" (UID: \"30390e60-a7e2-4abc-b7d6-2384bd758fdd\") " pod="metallb-system/controller-86ddb6bd46-n8mpk" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609575 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45b99996-1ced-47bc-a309-4195e4880944-metrics-certs\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609596 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-metrics-certs\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.609627 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-memberlist\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:54 crc kubenswrapper[4749]: E0310 16:03:54.609733 4749 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 10 16:03:54 crc kubenswrapper[4749]: E0310 16:03:54.609797 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-memberlist podName:150f068d-c570-4301-8ea6-aceb34c0f84b nodeName:}" failed. No retries permitted until 2026-03-10 16:03:55.109761765 +0000 UTC m=+932.231627452 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-memberlist") pod "speaker-cmdn5" (UID: "150f068d-c570-4301-8ea6-aceb34c0f84b") : secret "metallb-memberlist" not found Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.610403 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/45b99996-1ced-47bc-a309-4195e4880944-frr-sockets\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.610884 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/45b99996-1ced-47bc-a309-4195e4880944-reloader\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.610928 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/45b99996-1ced-47bc-a309-4195e4880944-frr-startup\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: E0310 16:03:54.611131 4749 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 10 16:03:54 crc kubenswrapper[4749]: E0310 16:03:54.611177 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45b99996-1ced-47bc-a309-4195e4880944-metrics-certs podName:45b99996-1ced-47bc-a309-4195e4880944 nodeName:}" failed. No retries permitted until 2026-03-10 16:03:55.111153773 +0000 UTC m=+932.233019450 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45b99996-1ced-47bc-a309-4195e4880944-metrics-certs") pod "frr-k8s-wxp7g" (UID: "45b99996-1ced-47bc-a309-4195e4880944") : secret "frr-k8s-certs-secret" not found Mar 10 16:03:54 crc kubenswrapper[4749]: E0310 16:03:54.611216 4749 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 10 16:03:54 crc kubenswrapper[4749]: E0310 16:03:54.611254 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-metrics-certs podName:150f068d-c570-4301-8ea6-aceb34c0f84b nodeName:}" failed. No retries permitted until 2026-03-10 16:03:55.111231395 +0000 UTC m=+932.233097082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-metrics-certs") pod "speaker-cmdn5" (UID: "150f068d-c570-4301-8ea6-aceb34c0f84b") : secret "speaker-certs-secret" not found Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.611672 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/45b99996-1ced-47bc-a309-4195e4880944-frr-conf\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.611806 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/150f068d-c570-4301-8ea6-aceb34c0f84b-metallb-excludel2\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.612011 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/45b99996-1ced-47bc-a309-4195e4880944-metrics\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.625324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee72aaa3-2e8c-41d1-ae7e-c446e531300a-cert\") pod \"frr-k8s-webhook-server-7f989f654f-s98fv\" (UID: \"ee72aaa3-2e8c-41d1-ae7e-c446e531300a\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.634023 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts8cj\" (UniqueName: \"kubernetes.io/projected/ee72aaa3-2e8c-41d1-ae7e-c446e531300a-kube-api-access-ts8cj\") pod \"frr-k8s-webhook-server-7f989f654f-s98fv\" (UID: \"ee72aaa3-2e8c-41d1-ae7e-c446e531300a\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.656964 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfxjj\" (UniqueName: \"kubernetes.io/projected/45b99996-1ced-47bc-a309-4195e4880944-kube-api-access-vfxjj\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.657455 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q22w\" (UniqueName: \"kubernetes.io/projected/150f068d-c570-4301-8ea6-aceb34c0f84b-kube-api-access-8q22w\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.710428 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30390e60-a7e2-4abc-b7d6-2384bd758fdd-metrics-certs\") pod 
\"controller-86ddb6bd46-n8mpk\" (UID: \"30390e60-a7e2-4abc-b7d6-2384bd758fdd\") " pod="metallb-system/controller-86ddb6bd46-n8mpk" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.710598 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30390e60-a7e2-4abc-b7d6-2384bd758fdd-cert\") pod \"controller-86ddb6bd46-n8mpk\" (UID: \"30390e60-a7e2-4abc-b7d6-2384bd758fdd\") " pod="metallb-system/controller-86ddb6bd46-n8mpk" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.710656 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vzkg\" (UniqueName: \"kubernetes.io/projected/30390e60-a7e2-4abc-b7d6-2384bd758fdd-kube-api-access-5vzkg\") pod \"controller-86ddb6bd46-n8mpk\" (UID: \"30390e60-a7e2-4abc-b7d6-2384bd758fdd\") " pod="metallb-system/controller-86ddb6bd46-n8mpk" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.714656 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.714849 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.715051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30390e60-a7e2-4abc-b7d6-2384bd758fdd-metrics-certs\") pod \"controller-86ddb6bd46-n8mpk\" (UID: \"30390e60-a7e2-4abc-b7d6-2384bd758fdd\") " pod="metallb-system/controller-86ddb6bd46-n8mpk" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.725033 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30390e60-a7e2-4abc-b7d6-2384bd758fdd-cert\") pod \"controller-86ddb6bd46-n8mpk\" (UID: \"30390e60-a7e2-4abc-b7d6-2384bd758fdd\") " pod="metallb-system/controller-86ddb6bd46-n8mpk" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.726824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vzkg\" (UniqueName: \"kubernetes.io/projected/30390e60-a7e2-4abc-b7d6-2384bd758fdd-kube-api-access-5vzkg\") pod \"controller-86ddb6bd46-n8mpk\" (UID: \"30390e60-a7e2-4abc-b7d6-2384bd758fdd\") " pod="metallb-system/controller-86ddb6bd46-n8mpk" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.906155 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-n8mpk" Mar 10 16:03:54 crc kubenswrapper[4749]: I0310 16:03:54.951320 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv"] Mar 10 16:03:54 crc kubenswrapper[4749]: W0310 16:03:54.961853 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee72aaa3_2e8c_41d1_ae7e_c446e531300a.slice/crio-f5350904369022193413646a10ab1555f389110ecaa043b613280c7ec45e48c0 WatchSource:0}: Error finding container f5350904369022193413646a10ab1555f389110ecaa043b613280c7ec45e48c0: Status 404 returned error can't find the container with id f5350904369022193413646a10ab1555f389110ecaa043b613280c7ec45e48c0 Mar 10 16:03:55 crc kubenswrapper[4749]: I0310 16:03:55.107836 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-n8mpk"] Mar 10 16:03:55 crc kubenswrapper[4749]: W0310 16:03:55.113087 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30390e60_a7e2_4abc_b7d6_2384bd758fdd.slice/crio-c163fe920dafc286fa9244e0c2e5ba4b05b5f848c132b9cf31f72537e3819db7 WatchSource:0}: Error finding container c163fe920dafc286fa9244e0c2e5ba4b05b5f848c132b9cf31f72537e3819db7: Status 404 returned error can't find the container with id c163fe920dafc286fa9244e0c2e5ba4b05b5f848c132b9cf31f72537e3819db7 Mar 10 16:03:55 crc kubenswrapper[4749]: I0310 16:03:55.114517 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45b99996-1ced-47bc-a309-4195e4880944-metrics-certs\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:55 crc kubenswrapper[4749]: I0310 16:03:55.114575 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-metrics-certs\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:55 crc kubenswrapper[4749]: I0310 16:03:55.114612 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-memberlist\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:55 crc kubenswrapper[4749]: E0310 16:03:55.114793 4749 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 10 16:03:55 crc kubenswrapper[4749]: E0310 16:03:55.114864 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-memberlist podName:150f068d-c570-4301-8ea6-aceb34c0f84b nodeName:}" failed. No retries permitted until 2026-03-10 16:03:56.114843699 +0000 UTC m=+933.236709386 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-memberlist") pod "speaker-cmdn5" (UID: "150f068d-c570-4301-8ea6-aceb34c0f84b") : secret "metallb-memberlist" not found Mar 10 16:03:55 crc kubenswrapper[4749]: I0310 16:03:55.118821 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45b99996-1ced-47bc-a309-4195e4880944-metrics-certs\") pod \"frr-k8s-wxp7g\" (UID: \"45b99996-1ced-47bc-a309-4195e4880944\") " pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:55 crc kubenswrapper[4749]: I0310 16:03:55.120512 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-metrics-certs\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:55 crc kubenswrapper[4749]: I0310 16:03:55.295665 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:03:55 crc kubenswrapper[4749]: I0310 16:03:55.382815 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-n8mpk" event={"ID":"30390e60-a7e2-4abc-b7d6-2384bd758fdd","Type":"ContainerStarted","Data":"1bb0a21c9fdda894b9406c606b5a56a0d51c613ede86f1a8a21fa4aec7571ea1"} Mar 10 16:03:55 crc kubenswrapper[4749]: I0310 16:03:55.382867 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-n8mpk" event={"ID":"30390e60-a7e2-4abc-b7d6-2384bd758fdd","Type":"ContainerStarted","Data":"daa3fbde2467fc1492e2d99bdb02702ffcc2afbe263d08799c6404f3f09d0391"} Mar 10 16:03:55 crc kubenswrapper[4749]: I0310 16:03:55.382882 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-n8mpk" event={"ID":"30390e60-a7e2-4abc-b7d6-2384bd758fdd","Type":"ContainerStarted","Data":"c163fe920dafc286fa9244e0c2e5ba4b05b5f848c132b9cf31f72537e3819db7"} Mar 10 16:03:55 crc kubenswrapper[4749]: I0310 16:03:55.382982 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-n8mpk" Mar 10 16:03:55 crc kubenswrapper[4749]: I0310 16:03:55.385627 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv" event={"ID":"ee72aaa3-2e8c-41d1-ae7e-c446e531300a","Type":"ContainerStarted","Data":"f5350904369022193413646a10ab1555f389110ecaa043b613280c7ec45e48c0"} Mar 10 16:03:55 crc kubenswrapper[4749]: I0310 16:03:55.403074 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-n8mpk" podStartSLOduration=1.403045115 podStartE2EDuration="1.403045115s" podCreationTimestamp="2026-03-10 16:03:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:03:55.395566811 +0000 UTC 
m=+932.517432498" watchObservedRunningTime="2026-03-10 16:03:55.403045115 +0000 UTC m=+932.524910802" Mar 10 16:03:56 crc kubenswrapper[4749]: I0310 16:03:56.132611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-memberlist\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:56 crc kubenswrapper[4749]: I0310 16:03:56.141499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/150f068d-c570-4301-8ea6-aceb34c0f84b-memberlist\") pod \"speaker-cmdn5\" (UID: \"150f068d-c570-4301-8ea6-aceb34c0f84b\") " pod="metallb-system/speaker-cmdn5" Mar 10 16:03:56 crc kubenswrapper[4749]: I0310 16:03:56.343875 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cmdn5" Mar 10 16:03:56 crc kubenswrapper[4749]: W0310 16:03:56.387472 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod150f068d_c570_4301_8ea6_aceb34c0f84b.slice/crio-43e768f52c9119deb1a11389637915556f6f2ba9a8de950393944c9981cf1572 WatchSource:0}: Error finding container 43e768f52c9119deb1a11389637915556f6f2ba9a8de950393944c9981cf1572: Status 404 returned error can't find the container with id 43e768f52c9119deb1a11389637915556f6f2ba9a8de950393944c9981cf1572 Mar 10 16:03:56 crc kubenswrapper[4749]: I0310 16:03:56.400559 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wxp7g" event={"ID":"45b99996-1ced-47bc-a309-4195e4880944","Type":"ContainerStarted","Data":"3bd647442d9eed2b22e879baa4cd0043793b270a3b5e15826575b027dae6e810"} Mar 10 16:03:57 crc kubenswrapper[4749]: I0310 16:03:57.410079 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cmdn5" 
event={"ID":"150f068d-c570-4301-8ea6-aceb34c0f84b","Type":"ContainerStarted","Data":"fc5db6eb448d91148da94cc18cef76eb2f7966bc02185ffd84c48b1ca170fd1a"} Mar 10 16:03:57 crc kubenswrapper[4749]: I0310 16:03:57.410361 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cmdn5" event={"ID":"150f068d-c570-4301-8ea6-aceb34c0f84b","Type":"ContainerStarted","Data":"75c1b3402512e49483ab5f6de1ba721330173649bd7550baac7163e33d83cebb"} Mar 10 16:03:57 crc kubenswrapper[4749]: I0310 16:03:57.410386 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cmdn5" event={"ID":"150f068d-c570-4301-8ea6-aceb34c0f84b","Type":"ContainerStarted","Data":"43e768f52c9119deb1a11389637915556f6f2ba9a8de950393944c9981cf1572"} Mar 10 16:03:57 crc kubenswrapper[4749]: I0310 16:03:57.410579 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-cmdn5" Mar 10 16:03:57 crc kubenswrapper[4749]: I0310 16:03:57.446778 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-cmdn5" podStartSLOduration=3.446739419 podStartE2EDuration="3.446739419s" podCreationTimestamp="2026-03-10 16:03:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:03:57.442466142 +0000 UTC m=+934.564331829" watchObservedRunningTime="2026-03-10 16:03:57.446739419 +0000 UTC m=+934.568605096" Mar 10 16:04:00 crc kubenswrapper[4749]: I0310 16:04:00.127275 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552644-9h9tm"] Mar 10 16:04:00 crc kubenswrapper[4749]: I0310 16:04:00.129053 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552644-9h9tm" Mar 10 16:04:00 crc kubenswrapper[4749]: I0310 16:04:00.131709 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:04:00 crc kubenswrapper[4749]: I0310 16:04:00.131734 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:04:00 crc kubenswrapper[4749]: I0310 16:04:00.132009 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:04:00 crc kubenswrapper[4749]: I0310 16:04:00.137000 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552644-9h9tm"] Mar 10 16:04:00 crc kubenswrapper[4749]: I0310 16:04:00.191856 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d6nb\" (UniqueName: \"kubernetes.io/projected/275c8c2e-026a-4773-b021-58644290e646-kube-api-access-4d6nb\") pod \"auto-csr-approver-29552644-9h9tm\" (UID: \"275c8c2e-026a-4773-b021-58644290e646\") " pod="openshift-infra/auto-csr-approver-29552644-9h9tm" Mar 10 16:04:00 crc kubenswrapper[4749]: I0310 16:04:00.293080 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d6nb\" (UniqueName: \"kubernetes.io/projected/275c8c2e-026a-4773-b021-58644290e646-kube-api-access-4d6nb\") pod \"auto-csr-approver-29552644-9h9tm\" (UID: \"275c8c2e-026a-4773-b021-58644290e646\") " pod="openshift-infra/auto-csr-approver-29552644-9h9tm" Mar 10 16:04:00 crc kubenswrapper[4749]: I0310 16:04:00.315355 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d6nb\" (UniqueName: \"kubernetes.io/projected/275c8c2e-026a-4773-b021-58644290e646-kube-api-access-4d6nb\") pod \"auto-csr-approver-29552644-9h9tm\" (UID: \"275c8c2e-026a-4773-b021-58644290e646\") " 
pod="openshift-infra/auto-csr-approver-29552644-9h9tm" Mar 10 16:04:00 crc kubenswrapper[4749]: I0310 16:04:00.454708 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552644-9h9tm" Mar 10 16:04:02 crc kubenswrapper[4749]: I0310 16:04:02.285933 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552644-9h9tm"] Mar 10 16:04:02 crc kubenswrapper[4749]: W0310 16:04:02.289694 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod275c8c2e_026a_4773_b021_58644290e646.slice/crio-ad924b3b6d7ecebce6c29f6246ddbb57e87e70a47a78b7be02b34391213e3a98 WatchSource:0}: Error finding container ad924b3b6d7ecebce6c29f6246ddbb57e87e70a47a78b7be02b34391213e3a98: Status 404 returned error can't find the container with id ad924b3b6d7ecebce6c29f6246ddbb57e87e70a47a78b7be02b34391213e3a98 Mar 10 16:04:02 crc kubenswrapper[4749]: I0310 16:04:02.449963 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552644-9h9tm" event={"ID":"275c8c2e-026a-4773-b021-58644290e646","Type":"ContainerStarted","Data":"ad924b3b6d7ecebce6c29f6246ddbb57e87e70a47a78b7be02b34391213e3a98"} Mar 10 16:04:02 crc kubenswrapper[4749]: I0310 16:04:02.451566 4749 generic.go:334] "Generic (PLEG): container finished" podID="45b99996-1ced-47bc-a309-4195e4880944" containerID="5f8d19a4c8cb7a16047d0ba418fd008c6790eca5a2652f498311d0ab03b2d7b3" exitCode=0 Mar 10 16:04:02 crc kubenswrapper[4749]: I0310 16:04:02.451686 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wxp7g" event={"ID":"45b99996-1ced-47bc-a309-4195e4880944","Type":"ContainerDied","Data":"5f8d19a4c8cb7a16047d0ba418fd008c6790eca5a2652f498311d0ab03b2d7b3"} Mar 10 16:04:02 crc kubenswrapper[4749]: I0310 16:04:02.452835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv" event={"ID":"ee72aaa3-2e8c-41d1-ae7e-c446e531300a","Type":"ContainerStarted","Data":"1847b0edb71fdfb7ce410ef536aa79b6c236d77cb8ccd2806c851034f1866b0b"} Mar 10 16:04:02 crc kubenswrapper[4749]: I0310 16:04:02.452992 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv" Mar 10 16:04:02 crc kubenswrapper[4749]: I0310 16:04:02.508115 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv" podStartSLOduration=1.480331867 podStartE2EDuration="8.508091735s" podCreationTimestamp="2026-03-10 16:03:54 +0000 UTC" firstStartedPulling="2026-03-10 16:03:54.964978363 +0000 UTC m=+932.086844050" lastFinishedPulling="2026-03-10 16:04:01.992738231 +0000 UTC m=+939.114603918" observedRunningTime="2026-03-10 16:04:02.497101865 +0000 UTC m=+939.618967592" watchObservedRunningTime="2026-03-10 16:04:02.508091735 +0000 UTC m=+939.629957422" Mar 10 16:04:03 crc kubenswrapper[4749]: I0310 16:04:03.463214 4749 generic.go:334] "Generic (PLEG): container finished" podID="45b99996-1ced-47bc-a309-4195e4880944" containerID="33f9a3650dcd4d68b63c9dc51f949be4425f50f1c9b84a981fcb41262c56a386" exitCode=0 Mar 10 16:04:03 crc kubenswrapper[4749]: I0310 16:04:03.463480 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wxp7g" event={"ID":"45b99996-1ced-47bc-a309-4195e4880944","Type":"ContainerDied","Data":"33f9a3650dcd4d68b63c9dc51f949be4425f50f1c9b84a981fcb41262c56a386"} Mar 10 16:04:04 crc kubenswrapper[4749]: I0310 16:04:04.474446 4749 generic.go:334] "Generic (PLEG): container finished" podID="45b99996-1ced-47bc-a309-4195e4880944" containerID="92b3804427cc36b38176f064e52d56e68e1e76f0d85669150db502bcd41fad22" exitCode=0 Mar 10 16:04:04 crc kubenswrapper[4749]: I0310 16:04:04.474510 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-wxp7g" event={"ID":"45b99996-1ced-47bc-a309-4195e4880944","Type":"ContainerDied","Data":"92b3804427cc36b38176f064e52d56e68e1e76f0d85669150db502bcd41fad22"} Mar 10 16:04:04 crc kubenswrapper[4749]: I0310 16:04:04.477419 4749 generic.go:334] "Generic (PLEG): container finished" podID="275c8c2e-026a-4773-b021-58644290e646" containerID="a38bdddb79d11597cebedbefa552e2cec753c79dee4ea9ffef4c2aee1e2733b9" exitCode=0 Mar 10 16:04:04 crc kubenswrapper[4749]: I0310 16:04:04.477475 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552644-9h9tm" event={"ID":"275c8c2e-026a-4773-b021-58644290e646","Type":"ContainerDied","Data":"a38bdddb79d11597cebedbefa552e2cec753c79dee4ea9ffef4c2aee1e2733b9"} Mar 10 16:04:05 crc kubenswrapper[4749]: I0310 16:04:05.492087 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wxp7g" event={"ID":"45b99996-1ced-47bc-a309-4195e4880944","Type":"ContainerStarted","Data":"d50cc3b85808cbf1db231adb5d840df76853e9897d8e49b5d22dab9fe14c0379"} Mar 10 16:04:05 crc kubenswrapper[4749]: I0310 16:04:05.492513 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wxp7g" event={"ID":"45b99996-1ced-47bc-a309-4195e4880944","Type":"ContainerStarted","Data":"c4bbaf2e14420fb15cb7ae889086dfc5810fedb7f019419a1fae8aa2f0a69824"} Mar 10 16:04:05 crc kubenswrapper[4749]: I0310 16:04:05.492530 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wxp7g" event={"ID":"45b99996-1ced-47bc-a309-4195e4880944","Type":"ContainerStarted","Data":"68970bb3bad3f24eb2db7bfdfe05ce619747dc67db2a3f765f4898b2d501c78a"} Mar 10 16:04:05 crc kubenswrapper[4749]: I0310 16:04:05.492541 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wxp7g" event={"ID":"45b99996-1ced-47bc-a309-4195e4880944","Type":"ContainerStarted","Data":"d7aaca07e6c2fdbe56d00063940938c48da93e039728ccbe5491c7540ade80c1"} Mar 10 16:04:05 
crc kubenswrapper[4749]: I0310 16:04:05.492553 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wxp7g" event={"ID":"45b99996-1ced-47bc-a309-4195e4880944","Type":"ContainerStarted","Data":"9aa25ac663138ec0bcb3b810a5ff6b41d44f10e775fc99ff22fbfe8ed4644f6a"} Mar 10 16:04:05 crc kubenswrapper[4749]: I0310 16:04:05.492566 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wxp7g" event={"ID":"45b99996-1ced-47bc-a309-4195e4880944","Type":"ContainerStarted","Data":"632da3451705a7fbaa7c98a59da1944a063beb0327d4bf7898f496398c974b20"} Mar 10 16:04:05 crc kubenswrapper[4749]: I0310 16:04:05.492586 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:04:05 crc kubenswrapper[4749]: I0310 16:04:05.519189 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-wxp7g" podStartSLOduration=4.885560902 podStartE2EDuration="11.519164807s" podCreationTimestamp="2026-03-10 16:03:54 +0000 UTC" firstStartedPulling="2026-03-10 16:03:55.385583418 +0000 UTC m=+932.507449105" lastFinishedPulling="2026-03-10 16:04:02.019187323 +0000 UTC m=+939.141053010" observedRunningTime="2026-03-10 16:04:05.513845772 +0000 UTC m=+942.635711479" watchObservedRunningTime="2026-03-10 16:04:05.519164807 +0000 UTC m=+942.641030494" Mar 10 16:04:05 crc kubenswrapper[4749]: I0310 16:04:05.768169 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552644-9h9tm" Mar 10 16:04:05 crc kubenswrapper[4749]: I0310 16:04:05.882839 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d6nb\" (UniqueName: \"kubernetes.io/projected/275c8c2e-026a-4773-b021-58644290e646-kube-api-access-4d6nb\") pod \"275c8c2e-026a-4773-b021-58644290e646\" (UID: \"275c8c2e-026a-4773-b021-58644290e646\") " Mar 10 16:04:05 crc kubenswrapper[4749]: I0310 16:04:05.891270 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/275c8c2e-026a-4773-b021-58644290e646-kube-api-access-4d6nb" (OuterVolumeSpecName: "kube-api-access-4d6nb") pod "275c8c2e-026a-4773-b021-58644290e646" (UID: "275c8c2e-026a-4773-b021-58644290e646"). InnerVolumeSpecName "kube-api-access-4d6nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:04:05 crc kubenswrapper[4749]: I0310 16:04:05.985302 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d6nb\" (UniqueName: \"kubernetes.io/projected/275c8c2e-026a-4773-b021-58644290e646-kube-api-access-4d6nb\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:06 crc kubenswrapper[4749]: I0310 16:04:06.348743 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-cmdn5" Mar 10 16:04:06 crc kubenswrapper[4749]: I0310 16:04:06.501045 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552644-9h9tm" Mar 10 16:04:06 crc kubenswrapper[4749]: I0310 16:04:06.501043 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552644-9h9tm" event={"ID":"275c8c2e-026a-4773-b021-58644290e646","Type":"ContainerDied","Data":"ad924b3b6d7ecebce6c29f6246ddbb57e87e70a47a78b7be02b34391213e3a98"} Mar 10 16:04:06 crc kubenswrapper[4749]: I0310 16:04:06.502079 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad924b3b6d7ecebce6c29f6246ddbb57e87e70a47a78b7be02b34391213e3a98" Mar 10 16:04:06 crc kubenswrapper[4749]: I0310 16:04:06.817698 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552638-2j4cg"] Mar 10 16:04:06 crc kubenswrapper[4749]: I0310 16:04:06.823601 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552638-2j4cg"] Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.617945 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de41b32f-3fbe-43cc-b8fd-7dc121e0d686" path="/var/lib/kubelet/pods/de41b32f-3fbe-43cc-b8fd-7dc121e0d686/volumes" Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.708971 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd"] Mar 10 16:04:07 crc kubenswrapper[4749]: E0310 16:04:07.709211 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="275c8c2e-026a-4773-b021-58644290e646" containerName="oc" Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.709225 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="275c8c2e-026a-4773-b021-58644290e646" containerName="oc" Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.709359 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="275c8c2e-026a-4773-b021-58644290e646" containerName="oc" Mar 10 
16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.710120 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.712757 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.725019 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd"] Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.805429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpcbx\" (UniqueName: \"kubernetes.io/projected/8225dfe7-8f9f-4460-a2e1-800f515e9021-kube-api-access-qpcbx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd\" (UID: \"8225dfe7-8f9f-4460-a2e1-800f515e9021\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.805563 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8225dfe7-8f9f-4460-a2e1-800f515e9021-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd\" (UID: \"8225dfe7-8f9f-4460-a2e1-800f515e9021\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.805630 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8225dfe7-8f9f-4460-a2e1-800f515e9021-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd\" (UID: \"8225dfe7-8f9f-4460-a2e1-800f515e9021\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.906809 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8225dfe7-8f9f-4460-a2e1-800f515e9021-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd\" (UID: \"8225dfe7-8f9f-4460-a2e1-800f515e9021\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.906878 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpcbx\" (UniqueName: \"kubernetes.io/projected/8225dfe7-8f9f-4460-a2e1-800f515e9021-kube-api-access-qpcbx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd\" (UID: \"8225dfe7-8f9f-4460-a2e1-800f515e9021\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.906923 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8225dfe7-8f9f-4460-a2e1-800f515e9021-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd\" (UID: \"8225dfe7-8f9f-4460-a2e1-800f515e9021\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.907427 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8225dfe7-8f9f-4460-a2e1-800f515e9021-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd\" (UID: \"8225dfe7-8f9f-4460-a2e1-800f515e9021\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.907458 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8225dfe7-8f9f-4460-a2e1-800f515e9021-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd\" (UID: \"8225dfe7-8f9f-4460-a2e1-800f515e9021\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" Mar 10 16:04:07 crc kubenswrapper[4749]: I0310 16:04:07.926402 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpcbx\" (UniqueName: \"kubernetes.io/projected/8225dfe7-8f9f-4460-a2e1-800f515e9021-kube-api-access-qpcbx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd\" (UID: \"8225dfe7-8f9f-4460-a2e1-800f515e9021\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" Mar 10 16:04:08 crc kubenswrapper[4749]: I0310 16:04:08.023791 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" Mar 10 16:04:08 crc kubenswrapper[4749]: I0310 16:04:08.218223 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd"] Mar 10 16:04:08 crc kubenswrapper[4749]: I0310 16:04:08.532341 4749 generic.go:334] "Generic (PLEG): container finished" podID="8225dfe7-8f9f-4460-a2e1-800f515e9021" containerID="5b1d7199cd525c3addefb9fac9c2b71a91aff7fcd08c904f6a5216ed3cd6b51e" exitCode=0 Mar 10 16:04:08 crc kubenswrapper[4749]: I0310 16:04:08.532428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" event={"ID":"8225dfe7-8f9f-4460-a2e1-800f515e9021","Type":"ContainerDied","Data":"5b1d7199cd525c3addefb9fac9c2b71a91aff7fcd08c904f6a5216ed3cd6b51e"} Mar 10 16:04:08 crc kubenswrapper[4749]: I0310 16:04:08.532458 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" event={"ID":"8225dfe7-8f9f-4460-a2e1-800f515e9021","Type":"ContainerStarted","Data":"8f2e85d2008e43edc22a7224683bf2f597b3043a744905efc8acf27c7bad62b3"} Mar 10 16:04:10 crc kubenswrapper[4749]: I0310 16:04:10.295980 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:04:10 crc kubenswrapper[4749]: I0310 16:04:10.372109 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:04:13 crc kubenswrapper[4749]: I0310 16:04:13.575813 4749 generic.go:334] "Generic (PLEG): container finished" podID="8225dfe7-8f9f-4460-a2e1-800f515e9021" containerID="2e4399eb9452d0dcdd6d3d8193a5ab4f9ee4c2840c307296e4f9b9106381cdc3" exitCode=0 Mar 10 16:04:13 crc kubenswrapper[4749]: I0310 16:04:13.575962 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" event={"ID":"8225dfe7-8f9f-4460-a2e1-800f515e9021","Type":"ContainerDied","Data":"2e4399eb9452d0dcdd6d3d8193a5ab4f9ee4c2840c307296e4f9b9106381cdc3"} Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.046900 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h8s9z"] Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.048690 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.063264 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h8s9z"] Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.104697 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lskkp\" (UniqueName: \"kubernetes.io/projected/b4e24437-6367-4603-b923-d97a4e2e737a-kube-api-access-lskkp\") pod \"certified-operators-h8s9z\" (UID: \"b4e24437-6367-4603-b923-d97a4e2e737a\") " pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.104765 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4e24437-6367-4603-b923-d97a4e2e737a-catalog-content\") pod \"certified-operators-h8s9z\" (UID: \"b4e24437-6367-4603-b923-d97a4e2e737a\") " pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.104789 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4e24437-6367-4603-b923-d97a4e2e737a-utilities\") pod \"certified-operators-h8s9z\" (UID: \"b4e24437-6367-4603-b923-d97a4e2e737a\") " pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.206484 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lskkp\" (UniqueName: \"kubernetes.io/projected/b4e24437-6367-4603-b923-d97a4e2e737a-kube-api-access-lskkp\") pod \"certified-operators-h8s9z\" (UID: \"b4e24437-6367-4603-b923-d97a4e2e737a\") " pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.206560 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4e24437-6367-4603-b923-d97a4e2e737a-catalog-content\") pod \"certified-operators-h8s9z\" (UID: \"b4e24437-6367-4603-b923-d97a4e2e737a\") " pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.206587 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4e24437-6367-4603-b923-d97a4e2e737a-utilities\") pod \"certified-operators-h8s9z\" (UID: \"b4e24437-6367-4603-b923-d97a4e2e737a\") " pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.207186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4e24437-6367-4603-b923-d97a4e2e737a-utilities\") pod \"certified-operators-h8s9z\" (UID: \"b4e24437-6367-4603-b923-d97a4e2e737a\") " pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.207208 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4e24437-6367-4603-b923-d97a4e2e737a-catalog-content\") pod \"certified-operators-h8s9z\" (UID: \"b4e24437-6367-4603-b923-d97a4e2e737a\") " pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.230958 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lskkp\" (UniqueName: \"kubernetes.io/projected/b4e24437-6367-4603-b923-d97a4e2e737a-kube-api-access-lskkp\") pod \"certified-operators-h8s9z\" (UID: \"b4e24437-6367-4603-b923-d97a4e2e737a\") " pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.397878 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.589598 4749 generic.go:334] "Generic (PLEG): container finished" podID="8225dfe7-8f9f-4460-a2e1-800f515e9021" containerID="6e33d771eac33ab02e5ed4523533ded86312dd04b0ca20fd3c2c5d0892ef9519" exitCode=0 Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.589920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" event={"ID":"8225dfe7-8f9f-4460-a2e1-800f515e9021","Type":"ContainerDied","Data":"6e33d771eac33ab02e5ed4523533ded86312dd04b0ca20fd3c2c5d0892ef9519"} Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.618599 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h8s9z"] Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.721725 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s98fv" Mar 10 16:04:14 crc kubenswrapper[4749]: I0310 16:04:14.917052 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-n8mpk" Mar 10 16:04:15 crc kubenswrapper[4749]: I0310 16:04:15.300583 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-wxp7g" Mar 10 16:04:15 crc kubenswrapper[4749]: I0310 16:04:15.599397 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4e24437-6367-4603-b923-d97a4e2e737a" containerID="09885556dbbe16a808eac673ee8f4d0d7e5d8d12f62148a124a36353a779108d" exitCode=0 Mar 10 16:04:15 crc kubenswrapper[4749]: I0310 16:04:15.599680 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8s9z" event={"ID":"b4e24437-6367-4603-b923-d97a4e2e737a","Type":"ContainerDied","Data":"09885556dbbe16a808eac673ee8f4d0d7e5d8d12f62148a124a36353a779108d"} 
Mar 10 16:04:15 crc kubenswrapper[4749]: I0310 16:04:15.599714 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8s9z" event={"ID":"b4e24437-6367-4603-b923-d97a4e2e737a","Type":"ContainerStarted","Data":"ba3092f0e289ea70d7efb1901a3188d693a790153eb61e2cef1176c61e826ad3"} Mar 10 16:04:15 crc kubenswrapper[4749]: I0310 16:04:15.886758 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" Mar 10 16:04:15 crc kubenswrapper[4749]: I0310 16:04:15.940089 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8225dfe7-8f9f-4460-a2e1-800f515e9021-bundle\") pod \"8225dfe7-8f9f-4460-a2e1-800f515e9021\" (UID: \"8225dfe7-8f9f-4460-a2e1-800f515e9021\") " Mar 10 16:04:15 crc kubenswrapper[4749]: I0310 16:04:15.940145 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8225dfe7-8f9f-4460-a2e1-800f515e9021-util\") pod \"8225dfe7-8f9f-4460-a2e1-800f515e9021\" (UID: \"8225dfe7-8f9f-4460-a2e1-800f515e9021\") " Mar 10 16:04:15 crc kubenswrapper[4749]: I0310 16:04:15.940212 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpcbx\" (UniqueName: \"kubernetes.io/projected/8225dfe7-8f9f-4460-a2e1-800f515e9021-kube-api-access-qpcbx\") pod \"8225dfe7-8f9f-4460-a2e1-800f515e9021\" (UID: \"8225dfe7-8f9f-4460-a2e1-800f515e9021\") " Mar 10 16:04:15 crc kubenswrapper[4749]: I0310 16:04:15.941599 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8225dfe7-8f9f-4460-a2e1-800f515e9021-bundle" (OuterVolumeSpecName: "bundle") pod "8225dfe7-8f9f-4460-a2e1-800f515e9021" (UID: "8225dfe7-8f9f-4460-a2e1-800f515e9021"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:04:15 crc kubenswrapper[4749]: I0310 16:04:15.948777 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8225dfe7-8f9f-4460-a2e1-800f515e9021-kube-api-access-qpcbx" (OuterVolumeSpecName: "kube-api-access-qpcbx") pod "8225dfe7-8f9f-4460-a2e1-800f515e9021" (UID: "8225dfe7-8f9f-4460-a2e1-800f515e9021"). InnerVolumeSpecName "kube-api-access-qpcbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:04:15 crc kubenswrapper[4749]: I0310 16:04:15.957621 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8225dfe7-8f9f-4460-a2e1-800f515e9021-util" (OuterVolumeSpecName: "util") pod "8225dfe7-8f9f-4460-a2e1-800f515e9021" (UID: "8225dfe7-8f9f-4460-a2e1-800f515e9021"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:04:16 crc kubenswrapper[4749]: I0310 16:04:16.041781 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8225dfe7-8f9f-4460-a2e1-800f515e9021-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:16 crc kubenswrapper[4749]: I0310 16:04:16.042019 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8225dfe7-8f9f-4460-a2e1-800f515e9021-util\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:16 crc kubenswrapper[4749]: I0310 16:04:16.042103 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpcbx\" (UniqueName: \"kubernetes.io/projected/8225dfe7-8f9f-4460-a2e1-800f515e9021-kube-api-access-qpcbx\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:16 crc kubenswrapper[4749]: I0310 16:04:16.609859 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" 
event={"ID":"8225dfe7-8f9f-4460-a2e1-800f515e9021","Type":"ContainerDied","Data":"8f2e85d2008e43edc22a7224683bf2f597b3043a744905efc8acf27c7bad62b3"} Mar 10 16:04:16 crc kubenswrapper[4749]: I0310 16:04:16.610201 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f2e85d2008e43edc22a7224683bf2f597b3043a744905efc8acf27c7bad62b3" Mar 10 16:04:16 crc kubenswrapper[4749]: I0310 16:04:16.609915 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd" Mar 10 16:04:16 crc kubenswrapper[4749]: I0310 16:04:16.612673 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8s9z" event={"ID":"b4e24437-6367-4603-b923-d97a4e2e737a","Type":"ContainerStarted","Data":"ffba7bacda09b77e0560c0b8b3fa925ae3f8be187aa594416e7a5598c190e0ea"} Mar 10 16:04:17 crc kubenswrapper[4749]: I0310 16:04:17.622902 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4e24437-6367-4603-b923-d97a4e2e737a" containerID="ffba7bacda09b77e0560c0b8b3fa925ae3f8be187aa594416e7a5598c190e0ea" exitCode=0 Mar 10 16:04:17 crc kubenswrapper[4749]: I0310 16:04:17.622951 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8s9z" event={"ID":"b4e24437-6367-4603-b923-d97a4e2e737a","Type":"ContainerDied","Data":"ffba7bacda09b77e0560c0b8b3fa925ae3f8be187aa594416e7a5598c190e0ea"} Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.253173 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7lt2k"] Mar 10 16:04:18 crc kubenswrapper[4749]: E0310 16:04:18.253716 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8225dfe7-8f9f-4460-a2e1-800f515e9021" containerName="util" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.253750 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8225dfe7-8f9f-4460-a2e1-800f515e9021" containerName="util" Mar 10 16:04:18 crc kubenswrapper[4749]: E0310 16:04:18.253766 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8225dfe7-8f9f-4460-a2e1-800f515e9021" containerName="extract" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.253777 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8225dfe7-8f9f-4460-a2e1-800f515e9021" containerName="extract" Mar 10 16:04:18 crc kubenswrapper[4749]: E0310 16:04:18.253794 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8225dfe7-8f9f-4460-a2e1-800f515e9021" containerName="pull" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.253802 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8225dfe7-8f9f-4460-a2e1-800f515e9021" containerName="pull" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.253947 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8225dfe7-8f9f-4460-a2e1-800f515e9021" containerName="extract" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.254938 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.269201 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7lt2k"] Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.373191 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjknr\" (UniqueName: \"kubernetes.io/projected/8fc8418d-5f19-4b91-b4f4-a71464be433a-kube-api-access-zjknr\") pod \"community-operators-7lt2k\" (UID: \"8fc8418d-5f19-4b91-b4f4-a71464be433a\") " pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.373257 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc8418d-5f19-4b91-b4f4-a71464be433a-catalog-content\") pod \"community-operators-7lt2k\" (UID: \"8fc8418d-5f19-4b91-b4f4-a71464be433a\") " pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.373298 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc8418d-5f19-4b91-b4f4-a71464be433a-utilities\") pod \"community-operators-7lt2k\" (UID: \"8fc8418d-5f19-4b91-b4f4-a71464be433a\") " pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.474452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc8418d-5f19-4b91-b4f4-a71464be433a-catalog-content\") pod \"community-operators-7lt2k\" (UID: \"8fc8418d-5f19-4b91-b4f4-a71464be433a\") " pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.474527 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc8418d-5f19-4b91-b4f4-a71464be433a-utilities\") pod \"community-operators-7lt2k\" (UID: \"8fc8418d-5f19-4b91-b4f4-a71464be433a\") " pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.474590 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjknr\" (UniqueName: \"kubernetes.io/projected/8fc8418d-5f19-4b91-b4f4-a71464be433a-kube-api-access-zjknr\") pod \"community-operators-7lt2k\" (UID: \"8fc8418d-5f19-4b91-b4f4-a71464be433a\") " pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.475466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc8418d-5f19-4b91-b4f4-a71464be433a-catalog-content\") pod \"community-operators-7lt2k\" (UID: \"8fc8418d-5f19-4b91-b4f4-a71464be433a\") " pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.475748 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc8418d-5f19-4b91-b4f4-a71464be433a-utilities\") pod \"community-operators-7lt2k\" (UID: \"8fc8418d-5f19-4b91-b4f4-a71464be433a\") " pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.495425 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjknr\" (UniqueName: \"kubernetes.io/projected/8fc8418d-5f19-4b91-b4f4-a71464be433a-kube-api-access-zjknr\") pod \"community-operators-7lt2k\" (UID: \"8fc8418d-5f19-4b91-b4f4-a71464be433a\") " pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.572362 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.638657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8s9z" event={"ID":"b4e24437-6367-4603-b923-d97a4e2e737a","Type":"ContainerStarted","Data":"787fad26cc0da877af8ef6db69aa323845ab2545486c0ca6eb3505d229c9aec0"} Mar 10 16:04:18 crc kubenswrapper[4749]: I0310 16:04:18.709091 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h8s9z" podStartSLOduration=2.289392597 podStartE2EDuration="4.709069825s" podCreationTimestamp="2026-03-10 16:04:14 +0000 UTC" firstStartedPulling="2026-03-10 16:04:15.601507786 +0000 UTC m=+952.723373493" lastFinishedPulling="2026-03-10 16:04:18.021185014 +0000 UTC m=+955.143050721" observedRunningTime="2026-03-10 16:04:18.706306249 +0000 UTC m=+955.828171936" watchObservedRunningTime="2026-03-10 16:04:18.709069825 +0000 UTC m=+955.830935512" Mar 10 16:04:19 crc kubenswrapper[4749]: I0310 16:04:19.333213 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7lt2k"] Mar 10 16:04:19 crc kubenswrapper[4749]: I0310 16:04:19.646442 4749 generic.go:334] "Generic (PLEG): container finished" podID="8fc8418d-5f19-4b91-b4f4-a71464be433a" containerID="a6f406a2279fd6975bfeac1ef616091858830abf7a8a4083afe016b4d2d64af8" exitCode=0 Mar 10 16:04:19 crc kubenswrapper[4749]: I0310 16:04:19.646579 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lt2k" event={"ID":"8fc8418d-5f19-4b91-b4f4-a71464be433a","Type":"ContainerDied","Data":"a6f406a2279fd6975bfeac1ef616091858830abf7a8a4083afe016b4d2d64af8"} Mar 10 16:04:19 crc kubenswrapper[4749]: I0310 16:04:19.646650 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lt2k" 
event={"ID":"8fc8418d-5f19-4b91-b4f4-a71464be433a","Type":"ContainerStarted","Data":"9dae14aa5a7ac3052ddba8241949e12fe7dc6e2588e5e23d78978b6316ef991c"} Mar 10 16:04:20 crc kubenswrapper[4749]: I0310 16:04:20.511051 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-ch4rs"] Mar 10 16:04:20 crc kubenswrapper[4749]: I0310 16:04:20.511763 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-ch4rs" Mar 10 16:04:20 crc kubenswrapper[4749]: I0310 16:04:20.513443 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-zsfwh" Mar 10 16:04:20 crc kubenswrapper[4749]: I0310 16:04:20.514003 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 10 16:04:20 crc kubenswrapper[4749]: I0310 16:04:20.514041 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 10 16:04:20 crc kubenswrapper[4749]: I0310 16:04:20.533250 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-ch4rs"] Mar 10 16:04:20 crc kubenswrapper[4749]: I0310 16:04:20.603874 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn25h\" (UniqueName: \"kubernetes.io/projected/fbf4c349-bbda-4618-a260-2ffea91f2869-kube-api-access-vn25h\") pod \"cert-manager-operator-controller-manager-66c8bdd694-ch4rs\" (UID: \"fbf4c349-bbda-4618-a260-2ffea91f2869\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-ch4rs" Mar 10 16:04:20 crc kubenswrapper[4749]: I0310 16:04:20.603926 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fbf4c349-bbda-4618-a260-2ffea91f2869-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-ch4rs\" (UID: \"fbf4c349-bbda-4618-a260-2ffea91f2869\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-ch4rs" Mar 10 16:04:20 crc kubenswrapper[4749]: I0310 16:04:20.705712 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn25h\" (UniqueName: \"kubernetes.io/projected/fbf4c349-bbda-4618-a260-2ffea91f2869-kube-api-access-vn25h\") pod \"cert-manager-operator-controller-manager-66c8bdd694-ch4rs\" (UID: \"fbf4c349-bbda-4618-a260-2ffea91f2869\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-ch4rs" Mar 10 16:04:20 crc kubenswrapper[4749]: I0310 16:04:20.706754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fbf4c349-bbda-4618-a260-2ffea91f2869-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-ch4rs\" (UID: \"fbf4c349-bbda-4618-a260-2ffea91f2869\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-ch4rs" Mar 10 16:04:20 crc kubenswrapper[4749]: I0310 16:04:20.707464 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fbf4c349-bbda-4618-a260-2ffea91f2869-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-ch4rs\" (UID: \"fbf4c349-bbda-4618-a260-2ffea91f2869\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-ch4rs" Mar 10 16:04:20 crc kubenswrapper[4749]: I0310 16:04:20.725759 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn25h\" (UniqueName: \"kubernetes.io/projected/fbf4c349-bbda-4618-a260-2ffea91f2869-kube-api-access-vn25h\") pod \"cert-manager-operator-controller-manager-66c8bdd694-ch4rs\" (UID: 
\"fbf4c349-bbda-4618-a260-2ffea91f2869\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-ch4rs" Mar 10 16:04:20 crc kubenswrapper[4749]: I0310 16:04:20.830034 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-ch4rs" Mar 10 16:04:21 crc kubenswrapper[4749]: I0310 16:04:21.224914 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-ch4rs"] Mar 10 16:04:21 crc kubenswrapper[4749]: W0310 16:04:21.234903 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbf4c349_bbda_4618_a260_2ffea91f2869.slice/crio-ac7d1b82efe2076e2c44ddaa984bdb2d1cc1c7341d16d35da4e32c2441585ce2 WatchSource:0}: Error finding container ac7d1b82efe2076e2c44ddaa984bdb2d1cc1c7341d16d35da4e32c2441585ce2: Status 404 returned error can't find the container with id ac7d1b82efe2076e2c44ddaa984bdb2d1cc1c7341d16d35da4e32c2441585ce2 Mar 10 16:04:21 crc kubenswrapper[4749]: I0310 16:04:21.659809 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-ch4rs" event={"ID":"fbf4c349-bbda-4618-a260-2ffea91f2869","Type":"ContainerStarted","Data":"ac7d1b82efe2076e2c44ddaa984bdb2d1cc1c7341d16d35da4e32c2441585ce2"} Mar 10 16:04:21 crc kubenswrapper[4749]: I0310 16:04:21.661702 4749 generic.go:334] "Generic (PLEG): container finished" podID="8fc8418d-5f19-4b91-b4f4-a71464be433a" containerID="f9d8ee990608eb24b4f9a56e3287b68dbd1cd81d9dfebcd72f729fd9d09b5414" exitCode=0 Mar 10 16:04:21 crc kubenswrapper[4749]: I0310 16:04:21.661750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lt2k" 
event={"ID":"8fc8418d-5f19-4b91-b4f4-a71464be433a","Type":"ContainerDied","Data":"f9d8ee990608eb24b4f9a56e3287b68dbd1cd81d9dfebcd72f729fd9d09b5414"} Mar 10 16:04:22 crc kubenswrapper[4749]: I0310 16:04:22.676149 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lt2k" event={"ID":"8fc8418d-5f19-4b91-b4f4-a71464be433a","Type":"ContainerStarted","Data":"5fbc6651ba154ee3712e8e0b2a88638a2e4ba2ea0e2aba927ba9f54951940007"} Mar 10 16:04:22 crc kubenswrapper[4749]: I0310 16:04:22.707063 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7lt2k" podStartSLOduration=2.089729048 podStartE2EDuration="4.707038448s" podCreationTimestamp="2026-03-10 16:04:18 +0000 UTC" firstStartedPulling="2026-03-10 16:04:19.64817619 +0000 UTC m=+956.770041887" lastFinishedPulling="2026-03-10 16:04:22.2654856 +0000 UTC m=+959.387351287" observedRunningTime="2026-03-10 16:04:22.70162165 +0000 UTC m=+959.823487347" watchObservedRunningTime="2026-03-10 16:04:22.707038448 +0000 UTC m=+959.828904135" Mar 10 16:04:24 crc kubenswrapper[4749]: I0310 16:04:24.399354 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:24 crc kubenswrapper[4749]: I0310 16:04:24.399729 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:24 crc kubenswrapper[4749]: I0310 16:04:24.463259 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:24 crc kubenswrapper[4749]: I0310 16:04:24.724927 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:25 crc kubenswrapper[4749]: I0310 16:04:25.698332 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-ch4rs" event={"ID":"fbf4c349-bbda-4618-a260-2ffea91f2869","Type":"ContainerStarted","Data":"bf4ef052a13ca54f3384fc9ad636b669c54881e187c1b8ddf5efd7b98f107af1"} Mar 10 16:04:25 crc kubenswrapper[4749]: I0310 16:04:25.732629 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-ch4rs" podStartSLOduration=2.329152481 podStartE2EDuration="5.732607988s" podCreationTimestamp="2026-03-10 16:04:20 +0000 UTC" firstStartedPulling="2026-03-10 16:04:21.237091925 +0000 UTC m=+958.358957612" lastFinishedPulling="2026-03-10 16:04:24.640547432 +0000 UTC m=+961.762413119" observedRunningTime="2026-03-10 16:04:25.726835521 +0000 UTC m=+962.848701208" watchObservedRunningTime="2026-03-10 16:04:25.732607988 +0000 UTC m=+962.854473675" Mar 10 16:04:27 crc kubenswrapper[4749]: I0310 16:04:27.836277 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h8s9z"] Mar 10 16:04:27 crc kubenswrapper[4749]: I0310 16:04:27.836910 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h8s9z" podUID="b4e24437-6367-4603-b923-d97a4e2e737a" containerName="registry-server" containerID="cri-o://787fad26cc0da877af8ef6db69aa323845ab2545486c0ca6eb3505d229c9aec0" gracePeriod=2 Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.217483 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.312903 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lskkp\" (UniqueName: \"kubernetes.io/projected/b4e24437-6367-4603-b923-d97a4e2e737a-kube-api-access-lskkp\") pod \"b4e24437-6367-4603-b923-d97a4e2e737a\" (UID: \"b4e24437-6367-4603-b923-d97a4e2e737a\") " Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.312993 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4e24437-6367-4603-b923-d97a4e2e737a-utilities\") pod \"b4e24437-6367-4603-b923-d97a4e2e737a\" (UID: \"b4e24437-6367-4603-b923-d97a4e2e737a\") " Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.313090 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4e24437-6367-4603-b923-d97a4e2e737a-catalog-content\") pod \"b4e24437-6367-4603-b923-d97a4e2e737a\" (UID: \"b4e24437-6367-4603-b923-d97a4e2e737a\") " Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.314185 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4e24437-6367-4603-b923-d97a4e2e737a-utilities" (OuterVolumeSpecName: "utilities") pod "b4e24437-6367-4603-b923-d97a4e2e737a" (UID: "b4e24437-6367-4603-b923-d97a4e2e737a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.319971 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4e24437-6367-4603-b923-d97a4e2e737a-kube-api-access-lskkp" (OuterVolumeSpecName: "kube-api-access-lskkp") pod "b4e24437-6367-4603-b923-d97a4e2e737a" (UID: "b4e24437-6367-4603-b923-d97a4e2e737a"). InnerVolumeSpecName "kube-api-access-lskkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.369711 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4e24437-6367-4603-b923-d97a4e2e737a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4e24437-6367-4603-b923-d97a4e2e737a" (UID: "b4e24437-6367-4603-b923-d97a4e2e737a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.415092 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4e24437-6367-4603-b923-d97a4e2e737a-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.415151 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4e24437-6367-4603-b923-d97a4e2e737a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.415170 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lskkp\" (UniqueName: \"kubernetes.io/projected/b4e24437-6367-4603-b923-d97a4e2e737a-kube-api-access-lskkp\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.573475 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.573592 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.614139 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.719417 4749 generic.go:334] "Generic (PLEG): container 
finished" podID="b4e24437-6367-4603-b923-d97a4e2e737a" containerID="787fad26cc0da877af8ef6db69aa323845ab2545486c0ca6eb3505d229c9aec0" exitCode=0 Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.719491 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h8s9z" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.719523 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8s9z" event={"ID":"b4e24437-6367-4603-b923-d97a4e2e737a","Type":"ContainerDied","Data":"787fad26cc0da877af8ef6db69aa323845ab2545486c0ca6eb3505d229c9aec0"} Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.719558 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8s9z" event={"ID":"b4e24437-6367-4603-b923-d97a4e2e737a","Type":"ContainerDied","Data":"ba3092f0e289ea70d7efb1901a3188d693a790153eb61e2cef1176c61e826ad3"} Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.719589 4749 scope.go:117] "RemoveContainer" containerID="787fad26cc0da877af8ef6db69aa323845ab2545486c0ca6eb3505d229c9aec0" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.738300 4749 scope.go:117] "RemoveContainer" containerID="ffba7bacda09b77e0560c0b8b3fa925ae3f8be187aa594416e7a5598c190e0ea" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.762136 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h8s9z"] Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.767034 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h8s9z"] Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.774173 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.775093 4749 scope.go:117] "RemoveContainer" 
containerID="09885556dbbe16a808eac673ee8f4d0d7e5d8d12f62148a124a36353a779108d" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.791353 4749 scope.go:117] "RemoveContainer" containerID="787fad26cc0da877af8ef6db69aa323845ab2545486c0ca6eb3505d229c9aec0" Mar 10 16:04:28 crc kubenswrapper[4749]: E0310 16:04:28.792215 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"787fad26cc0da877af8ef6db69aa323845ab2545486c0ca6eb3505d229c9aec0\": container with ID starting with 787fad26cc0da877af8ef6db69aa323845ab2545486c0ca6eb3505d229c9aec0 not found: ID does not exist" containerID="787fad26cc0da877af8ef6db69aa323845ab2545486c0ca6eb3505d229c9aec0" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.792262 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"787fad26cc0da877af8ef6db69aa323845ab2545486c0ca6eb3505d229c9aec0"} err="failed to get container status \"787fad26cc0da877af8ef6db69aa323845ab2545486c0ca6eb3505d229c9aec0\": rpc error: code = NotFound desc = could not find container \"787fad26cc0da877af8ef6db69aa323845ab2545486c0ca6eb3505d229c9aec0\": container with ID starting with 787fad26cc0da877af8ef6db69aa323845ab2545486c0ca6eb3505d229c9aec0 not found: ID does not exist" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.792290 4749 scope.go:117] "RemoveContainer" containerID="ffba7bacda09b77e0560c0b8b3fa925ae3f8be187aa594416e7a5598c190e0ea" Mar 10 16:04:28 crc kubenswrapper[4749]: E0310 16:04:28.792611 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffba7bacda09b77e0560c0b8b3fa925ae3f8be187aa594416e7a5598c190e0ea\": container with ID starting with ffba7bacda09b77e0560c0b8b3fa925ae3f8be187aa594416e7a5598c190e0ea not found: ID does not exist" containerID="ffba7bacda09b77e0560c0b8b3fa925ae3f8be187aa594416e7a5598c190e0ea" Mar 10 16:04:28 crc 
kubenswrapper[4749]: I0310 16:04:28.792654 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffba7bacda09b77e0560c0b8b3fa925ae3f8be187aa594416e7a5598c190e0ea"} err="failed to get container status \"ffba7bacda09b77e0560c0b8b3fa925ae3f8be187aa594416e7a5598c190e0ea\": rpc error: code = NotFound desc = could not find container \"ffba7bacda09b77e0560c0b8b3fa925ae3f8be187aa594416e7a5598c190e0ea\": container with ID starting with ffba7bacda09b77e0560c0b8b3fa925ae3f8be187aa594416e7a5598c190e0ea not found: ID does not exist" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.792673 4749 scope.go:117] "RemoveContainer" containerID="09885556dbbe16a808eac673ee8f4d0d7e5d8d12f62148a124a36353a779108d" Mar 10 16:04:28 crc kubenswrapper[4749]: E0310 16:04:28.792902 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09885556dbbe16a808eac673ee8f4d0d7e5d8d12f62148a124a36353a779108d\": container with ID starting with 09885556dbbe16a808eac673ee8f4d0d7e5d8d12f62148a124a36353a779108d not found: ID does not exist" containerID="09885556dbbe16a808eac673ee8f4d0d7e5d8d12f62148a124a36353a779108d" Mar 10 16:04:28 crc kubenswrapper[4749]: I0310 16:04:28.792951 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09885556dbbe16a808eac673ee8f4d0d7e5d8d12f62148a124a36353a779108d"} err="failed to get container status \"09885556dbbe16a808eac673ee8f4d0d7e5d8d12f62148a124a36353a779108d\": rpc error: code = NotFound desc = could not find container \"09885556dbbe16a808eac673ee8f4d0d7e5d8d12f62148a124a36353a779108d\": container with ID starting with 09885556dbbe16a808eac673ee8f4d0d7e5d8d12f62148a124a36353a779108d not found: ID does not exist" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.545104 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n2h5b"] Mar 10 
16:04:29 crc kubenswrapper[4749]: E0310 16:04:29.545549 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e24437-6367-4603-b923-d97a4e2e737a" containerName="extract-utilities" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.545883 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e24437-6367-4603-b923-d97a4e2e737a" containerName="extract-utilities" Mar 10 16:04:29 crc kubenswrapper[4749]: E0310 16:04:29.545893 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e24437-6367-4603-b923-d97a4e2e737a" containerName="extract-content" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.545901 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e24437-6367-4603-b923-d97a4e2e737a" containerName="extract-content" Mar 10 16:04:29 crc kubenswrapper[4749]: E0310 16:04:29.545919 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e24437-6367-4603-b923-d97a4e2e737a" containerName="registry-server" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.545925 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e24437-6367-4603-b923-d97a4e2e737a" containerName="registry-server" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.546024 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4e24437-6367-4603-b923-d97a4e2e737a" containerName="registry-server" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.546427 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-n2h5b" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.548649 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.548807 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-txxn8" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.550559 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.559250 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n2h5b"] Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.615749 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4e24437-6367-4603-b923-d97a4e2e737a" path="/var/lib/kubelet/pods/b4e24437-6367-4603-b923-d97a4e2e737a/volumes" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.630978 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n2h5b\" (UID: \"787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n2h5b" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.631036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krhwl\" (UniqueName: \"kubernetes.io/projected/787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e-kube-api-access-krhwl\") pod \"cert-manager-cainjector-5545bd876-n2h5b\" (UID: \"787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n2h5b" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.731892 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krhwl\" (UniqueName: \"kubernetes.io/projected/787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e-kube-api-access-krhwl\") pod \"cert-manager-cainjector-5545bd876-n2h5b\" (UID: \"787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n2h5b" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.732028 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n2h5b\" (UID: \"787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n2h5b" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.751046 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krhwl\" (UniqueName: \"kubernetes.io/projected/787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e-kube-api-access-krhwl\") pod \"cert-manager-cainjector-5545bd876-n2h5b\" (UID: \"787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n2h5b" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.755967 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n2h5b\" (UID: \"787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n2h5b" Mar 10 16:04:29 crc kubenswrapper[4749]: I0310 16:04:29.864632 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-n2h5b" Mar 10 16:04:30 crc kubenswrapper[4749]: I0310 16:04:30.068543 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n2h5b"] Mar 10 16:04:30 crc kubenswrapper[4749]: I0310 16:04:30.735923 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-n2h5b" event={"ID":"787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e","Type":"ContainerStarted","Data":"eb80a9f87c23c006a651b5ec019dda895cfc40d21cdf89dd04e2ef034d4f558c"} Mar 10 16:04:30 crc kubenswrapper[4749]: I0310 16:04:30.926682 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-5d2nd"] Mar 10 16:04:30 crc kubenswrapper[4749]: I0310 16:04:30.927831 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-5d2nd" Mar 10 16:04:30 crc kubenswrapper[4749]: I0310 16:04:30.931088 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-dfskf" Mar 10 16:04:30 crc kubenswrapper[4749]: I0310 16:04:30.935961 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-5d2nd"] Mar 10 16:04:31 crc kubenswrapper[4749]: I0310 16:04:31.056135 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv9jg\" (UniqueName: \"kubernetes.io/projected/882b2688-d221-4a4c-8771-0df154029fcb-kube-api-access-lv9jg\") pod \"cert-manager-webhook-6888856db4-5d2nd\" (UID: \"882b2688-d221-4a4c-8771-0df154029fcb\") " pod="cert-manager/cert-manager-webhook-6888856db4-5d2nd" Mar 10 16:04:31 crc kubenswrapper[4749]: I0310 16:04:31.056209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/882b2688-d221-4a4c-8771-0df154029fcb-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-5d2nd\" (UID: \"882b2688-d221-4a4c-8771-0df154029fcb\") " pod="cert-manager/cert-manager-webhook-6888856db4-5d2nd" Mar 10 16:04:31 crc kubenswrapper[4749]: I0310 16:04:31.157820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv9jg\" (UniqueName: \"kubernetes.io/projected/882b2688-d221-4a4c-8771-0df154029fcb-kube-api-access-lv9jg\") pod \"cert-manager-webhook-6888856db4-5d2nd\" (UID: \"882b2688-d221-4a4c-8771-0df154029fcb\") " pod="cert-manager/cert-manager-webhook-6888856db4-5d2nd" Mar 10 16:04:31 crc kubenswrapper[4749]: I0310 16:04:31.158287 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/882b2688-d221-4a4c-8771-0df154029fcb-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-5d2nd\" (UID: \"882b2688-d221-4a4c-8771-0df154029fcb\") " pod="cert-manager/cert-manager-webhook-6888856db4-5d2nd" Mar 10 16:04:31 crc kubenswrapper[4749]: I0310 16:04:31.181928 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/882b2688-d221-4a4c-8771-0df154029fcb-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-5d2nd\" (UID: \"882b2688-d221-4a4c-8771-0df154029fcb\") " pod="cert-manager/cert-manager-webhook-6888856db4-5d2nd" Mar 10 16:04:31 crc kubenswrapper[4749]: I0310 16:04:31.182577 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv9jg\" (UniqueName: \"kubernetes.io/projected/882b2688-d221-4a4c-8771-0df154029fcb-kube-api-access-lv9jg\") pod \"cert-manager-webhook-6888856db4-5d2nd\" (UID: \"882b2688-d221-4a4c-8771-0df154029fcb\") " pod="cert-manager/cert-manager-webhook-6888856db4-5d2nd" Mar 10 16:04:31 crc kubenswrapper[4749]: I0310 16:04:31.259772 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-5d2nd" Mar 10 16:04:31 crc kubenswrapper[4749]: I0310 16:04:31.693997 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-5d2nd"] Mar 10 16:04:31 crc kubenswrapper[4749]: W0310 16:04:31.707199 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod882b2688_d221_4a4c_8771_0df154029fcb.slice/crio-5a31813e538361ff5d76c57554b0203ef55519f2a3afed81600c4781fd8bac5f WatchSource:0}: Error finding container 5a31813e538361ff5d76c57554b0203ef55519f2a3afed81600c4781fd8bac5f: Status 404 returned error can't find the container with id 5a31813e538361ff5d76c57554b0203ef55519f2a3afed81600c4781fd8bac5f Mar 10 16:04:31 crc kubenswrapper[4749]: I0310 16:04:31.744172 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-5d2nd" event={"ID":"882b2688-d221-4a4c-8771-0df154029fcb","Type":"ContainerStarted","Data":"5a31813e538361ff5d76c57554b0203ef55519f2a3afed81600c4781fd8bac5f"} Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.038257 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7lt2k"] Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.038620 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7lt2k" podUID="8fc8418d-5f19-4b91-b4f4-a71464be433a" containerName="registry-server" containerID="cri-o://5fbc6651ba154ee3712e8e0b2a88638a2e4ba2ea0e2aba927ba9f54951940007" gracePeriod=2 Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.540079 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.587135 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjknr\" (UniqueName: \"kubernetes.io/projected/8fc8418d-5f19-4b91-b4f4-a71464be433a-kube-api-access-zjknr\") pod \"8fc8418d-5f19-4b91-b4f4-a71464be433a\" (UID: \"8fc8418d-5f19-4b91-b4f4-a71464be433a\") " Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.587316 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc8418d-5f19-4b91-b4f4-a71464be433a-utilities\") pod \"8fc8418d-5f19-4b91-b4f4-a71464be433a\" (UID: \"8fc8418d-5f19-4b91-b4f4-a71464be433a\") " Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.587379 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc8418d-5f19-4b91-b4f4-a71464be433a-catalog-content\") pod \"8fc8418d-5f19-4b91-b4f4-a71464be433a\" (UID: \"8fc8418d-5f19-4b91-b4f4-a71464be433a\") " Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.591638 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fc8418d-5f19-4b91-b4f4-a71464be433a-utilities" (OuterVolumeSpecName: "utilities") pod "8fc8418d-5f19-4b91-b4f4-a71464be433a" (UID: "8fc8418d-5f19-4b91-b4f4-a71464be433a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.597076 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc8418d-5f19-4b91-b4f4-a71464be433a-kube-api-access-zjknr" (OuterVolumeSpecName: "kube-api-access-zjknr") pod "8fc8418d-5f19-4b91-b4f4-a71464be433a" (UID: "8fc8418d-5f19-4b91-b4f4-a71464be433a"). InnerVolumeSpecName "kube-api-access-zjknr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.655009 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fc8418d-5f19-4b91-b4f4-a71464be433a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fc8418d-5f19-4b91-b4f4-a71464be433a" (UID: "8fc8418d-5f19-4b91-b4f4-a71464be433a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.689099 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fc8418d-5f19-4b91-b4f4-a71464be433a-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.689136 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fc8418d-5f19-4b91-b4f4-a71464be433a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.689148 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjknr\" (UniqueName: \"kubernetes.io/projected/8fc8418d-5f19-4b91-b4f4-a71464be433a-kube-api-access-zjknr\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.798535 4749 generic.go:334] "Generic (PLEG): container finished" podID="8fc8418d-5f19-4b91-b4f4-a71464be433a" containerID="5fbc6651ba154ee3712e8e0b2a88638a2e4ba2ea0e2aba927ba9f54951940007" exitCode=0 Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.798584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lt2k" event={"ID":"8fc8418d-5f19-4b91-b4f4-a71464be433a","Type":"ContainerDied","Data":"5fbc6651ba154ee3712e8e0b2a88638a2e4ba2ea0e2aba927ba9f54951940007"} Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.798615 4749 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-7lt2k" event={"ID":"8fc8418d-5f19-4b91-b4f4-a71464be433a","Type":"ContainerDied","Data":"9dae14aa5a7ac3052ddba8241949e12fe7dc6e2588e5e23d78978b6316ef991c"} Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.798634 4749 scope.go:117] "RemoveContainer" containerID="5fbc6651ba154ee3712e8e0b2a88638a2e4ba2ea0e2aba927ba9f54951940007" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.798772 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7lt2k" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.845864 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7lt2k"] Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.855935 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7lt2k"] Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.873572 4749 scope.go:117] "RemoveContainer" containerID="f9d8ee990608eb24b4f9a56e3287b68dbd1cd81d9dfebcd72f729fd9d09b5414" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.920498 4749 scope.go:117] "RemoveContainer" containerID="a6f406a2279fd6975bfeac1ef616091858830abf7a8a4083afe016b4d2d64af8" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.946567 4749 scope.go:117] "RemoveContainer" containerID="5fbc6651ba154ee3712e8e0b2a88638a2e4ba2ea0e2aba927ba9f54951940007" Mar 10 16:04:32 crc kubenswrapper[4749]: E0310 16:04:32.947050 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fbc6651ba154ee3712e8e0b2a88638a2e4ba2ea0e2aba927ba9f54951940007\": container with ID starting with 5fbc6651ba154ee3712e8e0b2a88638a2e4ba2ea0e2aba927ba9f54951940007 not found: ID does not exist" containerID="5fbc6651ba154ee3712e8e0b2a88638a2e4ba2ea0e2aba927ba9f54951940007" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 
16:04:32.947079 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fbc6651ba154ee3712e8e0b2a88638a2e4ba2ea0e2aba927ba9f54951940007"} err="failed to get container status \"5fbc6651ba154ee3712e8e0b2a88638a2e4ba2ea0e2aba927ba9f54951940007\": rpc error: code = NotFound desc = could not find container \"5fbc6651ba154ee3712e8e0b2a88638a2e4ba2ea0e2aba927ba9f54951940007\": container with ID starting with 5fbc6651ba154ee3712e8e0b2a88638a2e4ba2ea0e2aba927ba9f54951940007 not found: ID does not exist" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.947103 4749 scope.go:117] "RemoveContainer" containerID="f9d8ee990608eb24b4f9a56e3287b68dbd1cd81d9dfebcd72f729fd9d09b5414" Mar 10 16:04:32 crc kubenswrapper[4749]: E0310 16:04:32.947536 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d8ee990608eb24b4f9a56e3287b68dbd1cd81d9dfebcd72f729fd9d09b5414\": container with ID starting with f9d8ee990608eb24b4f9a56e3287b68dbd1cd81d9dfebcd72f729fd9d09b5414 not found: ID does not exist" containerID="f9d8ee990608eb24b4f9a56e3287b68dbd1cd81d9dfebcd72f729fd9d09b5414" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.947577 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d8ee990608eb24b4f9a56e3287b68dbd1cd81d9dfebcd72f729fd9d09b5414"} err="failed to get container status \"f9d8ee990608eb24b4f9a56e3287b68dbd1cd81d9dfebcd72f729fd9d09b5414\": rpc error: code = NotFound desc = could not find container \"f9d8ee990608eb24b4f9a56e3287b68dbd1cd81d9dfebcd72f729fd9d09b5414\": container with ID starting with f9d8ee990608eb24b4f9a56e3287b68dbd1cd81d9dfebcd72f729fd9d09b5414 not found: ID does not exist" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.947607 4749 scope.go:117] "RemoveContainer" containerID="a6f406a2279fd6975bfeac1ef616091858830abf7a8a4083afe016b4d2d64af8" Mar 10 16:04:32 crc 
kubenswrapper[4749]: E0310 16:04:32.948056 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6f406a2279fd6975bfeac1ef616091858830abf7a8a4083afe016b4d2d64af8\": container with ID starting with a6f406a2279fd6975bfeac1ef616091858830abf7a8a4083afe016b4d2d64af8 not found: ID does not exist" containerID="a6f406a2279fd6975bfeac1ef616091858830abf7a8a4083afe016b4d2d64af8" Mar 10 16:04:32 crc kubenswrapper[4749]: I0310 16:04:32.948083 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6f406a2279fd6975bfeac1ef616091858830abf7a8a4083afe016b4d2d64af8"} err="failed to get container status \"a6f406a2279fd6975bfeac1ef616091858830abf7a8a4083afe016b4d2d64af8\": rpc error: code = NotFound desc = could not find container \"a6f406a2279fd6975bfeac1ef616091858830abf7a8a4083afe016b4d2d64af8\": container with ID starting with a6f406a2279fd6975bfeac1ef616091858830abf7a8a4083afe016b4d2d64af8 not found: ID does not exist" Mar 10 16:04:33 crc kubenswrapper[4749]: I0310 16:04:33.630887 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc8418d-5f19-4b91-b4f4-a71464be433a" path="/var/lib/kubelet/pods/8fc8418d-5f19-4b91-b4f4-a71464be433a/volumes" Mar 10 16:04:35 crc kubenswrapper[4749]: I0310 16:04:35.817764 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-n2h5b" event={"ID":"787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e","Type":"ContainerStarted","Data":"ba6f46028e3b364f5dc637dc5e3d021d07a1a50dd4f481680ff24486f807efb5"} Mar 10 16:04:35 crc kubenswrapper[4749]: I0310 16:04:35.819171 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-5d2nd" event={"ID":"882b2688-d221-4a4c-8771-0df154029fcb","Type":"ContainerStarted","Data":"b359a30ab98e65b03d3171a76f69d9c42dc0012a66469bb269ccd7594b8a7b05"} Mar 10 16:04:35 crc kubenswrapper[4749]: I0310 
16:04:35.819362 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-5d2nd" Mar 10 16:04:35 crc kubenswrapper[4749]: I0310 16:04:35.840710 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-n2h5b" podStartSLOduration=2.250460333 podStartE2EDuration="6.840679908s" podCreationTimestamp="2026-03-10 16:04:29 +0000 UTC" firstStartedPulling="2026-03-10 16:04:30.076130077 +0000 UTC m=+967.197995764" lastFinishedPulling="2026-03-10 16:04:34.666349652 +0000 UTC m=+971.788215339" observedRunningTime="2026-03-10 16:04:35.832370018 +0000 UTC m=+972.954235695" watchObservedRunningTime="2026-03-10 16:04:35.840679908 +0000 UTC m=+972.962545635" Mar 10 16:04:35 crc kubenswrapper[4749]: I0310 16:04:35.850220 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-5d2nd" podStartSLOduration=2.876275662 podStartE2EDuration="5.850202132s" podCreationTimestamp="2026-03-10 16:04:30 +0000 UTC" firstStartedPulling="2026-03-10 16:04:31.716747165 +0000 UTC m=+968.838612862" lastFinishedPulling="2026-03-10 16:04:34.690673645 +0000 UTC m=+971.812539332" observedRunningTime="2026-03-10 16:04:35.847521768 +0000 UTC m=+972.969387455" watchObservedRunningTime="2026-03-10 16:04:35.850202132 +0000 UTC m=+972.972067849" Mar 10 16:04:41 crc kubenswrapper[4749]: I0310 16:04:41.262433 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-5d2nd" Mar 10 16:04:46 crc kubenswrapper[4749]: I0310 16:04:46.895176 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-9ztgr"] Mar 10 16:04:46 crc kubenswrapper[4749]: E0310 16:04:46.895697 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc8418d-5f19-4b91-b4f4-a71464be433a" containerName="extract-content" Mar 10 16:04:46 crc 
kubenswrapper[4749]: I0310 16:04:46.895710 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc8418d-5f19-4b91-b4f4-a71464be433a" containerName="extract-content" Mar 10 16:04:46 crc kubenswrapper[4749]: E0310 16:04:46.895730 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc8418d-5f19-4b91-b4f4-a71464be433a" containerName="extract-utilities" Mar 10 16:04:46 crc kubenswrapper[4749]: I0310 16:04:46.895736 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc8418d-5f19-4b91-b4f4-a71464be433a" containerName="extract-utilities" Mar 10 16:04:46 crc kubenswrapper[4749]: E0310 16:04:46.895743 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc8418d-5f19-4b91-b4f4-a71464be433a" containerName="registry-server" Mar 10 16:04:46 crc kubenswrapper[4749]: I0310 16:04:46.895750 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc8418d-5f19-4b91-b4f4-a71464be433a" containerName="registry-server" Mar 10 16:04:46 crc kubenswrapper[4749]: I0310 16:04:46.895843 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc8418d-5f19-4b91-b4f4-a71464be433a" containerName="registry-server" Mar 10 16:04:46 crc kubenswrapper[4749]: I0310 16:04:46.896265 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9ztgr" Mar 10 16:04:46 crc kubenswrapper[4749]: I0310 16:04:46.906825 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-7t657" Mar 10 16:04:46 crc kubenswrapper[4749]: I0310 16:04:46.908586 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9ztgr"] Mar 10 16:04:46 crc kubenswrapper[4749]: I0310 16:04:46.979505 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72e98565-a830-4f8e-a99b-8430585e2763-bound-sa-token\") pod \"cert-manager-545d4d4674-9ztgr\" (UID: \"72e98565-a830-4f8e-a99b-8430585e2763\") " pod="cert-manager/cert-manager-545d4d4674-9ztgr" Mar 10 16:04:46 crc kubenswrapper[4749]: I0310 16:04:46.979584 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpjqz\" (UniqueName: \"kubernetes.io/projected/72e98565-a830-4f8e-a99b-8430585e2763-kube-api-access-mpjqz\") pod \"cert-manager-545d4d4674-9ztgr\" (UID: \"72e98565-a830-4f8e-a99b-8430585e2763\") " pod="cert-manager/cert-manager-545d4d4674-9ztgr" Mar 10 16:04:47 crc kubenswrapper[4749]: I0310 16:04:47.081211 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72e98565-a830-4f8e-a99b-8430585e2763-bound-sa-token\") pod \"cert-manager-545d4d4674-9ztgr\" (UID: \"72e98565-a830-4f8e-a99b-8430585e2763\") " pod="cert-manager/cert-manager-545d4d4674-9ztgr" Mar 10 16:04:47 crc kubenswrapper[4749]: I0310 16:04:47.081303 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpjqz\" (UniqueName: \"kubernetes.io/projected/72e98565-a830-4f8e-a99b-8430585e2763-kube-api-access-mpjqz\") pod \"cert-manager-545d4d4674-9ztgr\" (UID: 
\"72e98565-a830-4f8e-a99b-8430585e2763\") " pod="cert-manager/cert-manager-545d4d4674-9ztgr" Mar 10 16:04:47 crc kubenswrapper[4749]: I0310 16:04:47.103473 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72e98565-a830-4f8e-a99b-8430585e2763-bound-sa-token\") pod \"cert-manager-545d4d4674-9ztgr\" (UID: \"72e98565-a830-4f8e-a99b-8430585e2763\") " pod="cert-manager/cert-manager-545d4d4674-9ztgr" Mar 10 16:04:47 crc kubenswrapper[4749]: I0310 16:04:47.103590 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpjqz\" (UniqueName: \"kubernetes.io/projected/72e98565-a830-4f8e-a99b-8430585e2763-kube-api-access-mpjqz\") pod \"cert-manager-545d4d4674-9ztgr\" (UID: \"72e98565-a830-4f8e-a99b-8430585e2763\") " pod="cert-manager/cert-manager-545d4d4674-9ztgr" Mar 10 16:04:47 crc kubenswrapper[4749]: I0310 16:04:47.221295 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9ztgr" Mar 10 16:04:47 crc kubenswrapper[4749]: I0310 16:04:47.678584 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9ztgr"] Mar 10 16:04:47 crc kubenswrapper[4749]: I0310 16:04:47.915325 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9ztgr" event={"ID":"72e98565-a830-4f8e-a99b-8430585e2763","Type":"ContainerStarted","Data":"fdd57bece9a65a16eeafc0cf2753fccb00e8cec544f165d0db3a6e112b0aa1a7"} Mar 10 16:04:47 crc kubenswrapper[4749]: I0310 16:04:47.915394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9ztgr" event={"ID":"72e98565-a830-4f8e-a99b-8430585e2763","Type":"ContainerStarted","Data":"6fc352e1c3723b9e9e890ebef85039393daa5202f644620d1b83c40646dbe2e9"} Mar 10 16:04:47 crc kubenswrapper[4749]: I0310 16:04:47.933001 4749 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-545d4d4674-9ztgr" podStartSLOduration=1.932983615 podStartE2EDuration="1.932983615s" podCreationTimestamp="2026-03-10 16:04:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:04:47.932649936 +0000 UTC m=+985.054515623" watchObservedRunningTime="2026-03-10 16:04:47.932983615 +0000 UTC m=+985.054849302" Mar 10 16:04:53 crc kubenswrapper[4749]: I0310 16:04:53.209752 4749 scope.go:117] "RemoveContainer" containerID="5db944d8e89880ee300a5fe1c79a6c36b1c6f7337b4fa429574aa3e51954c23b" Mar 10 16:04:54 crc kubenswrapper[4749]: I0310 16:04:54.466677 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jr927"] Mar 10 16:04:54 crc kubenswrapper[4749]: I0310 16:04:54.467900 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jr927" Mar 10 16:04:54 crc kubenswrapper[4749]: I0310 16:04:54.470361 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 10 16:04:54 crc kubenswrapper[4749]: I0310 16:04:54.470668 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 10 16:04:54 crc kubenswrapper[4749]: I0310 16:04:54.470935 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-n75z7" Mar 10 16:04:54 crc kubenswrapper[4749]: I0310 16:04:54.474648 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6msx\" (UniqueName: \"kubernetes.io/projected/5461cd34-ed04-40a8-9e4e-d230de836d00-kube-api-access-p6msx\") pod \"openstack-operator-index-jr927\" (UID: \"5461cd34-ed04-40a8-9e4e-d230de836d00\") " pod="openstack-operators/openstack-operator-index-jr927" Mar 10 16:04:54 
crc kubenswrapper[4749]: I0310 16:04:54.529005 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jr927"] Mar 10 16:04:54 crc kubenswrapper[4749]: I0310 16:04:54.575912 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6msx\" (UniqueName: \"kubernetes.io/projected/5461cd34-ed04-40a8-9e4e-d230de836d00-kube-api-access-p6msx\") pod \"openstack-operator-index-jr927\" (UID: \"5461cd34-ed04-40a8-9e4e-d230de836d00\") " pod="openstack-operators/openstack-operator-index-jr927" Mar 10 16:04:54 crc kubenswrapper[4749]: I0310 16:04:54.593715 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6msx\" (UniqueName: \"kubernetes.io/projected/5461cd34-ed04-40a8-9e4e-d230de836d00-kube-api-access-p6msx\") pod \"openstack-operator-index-jr927\" (UID: \"5461cd34-ed04-40a8-9e4e-d230de836d00\") " pod="openstack-operators/openstack-operator-index-jr927" Mar 10 16:04:54 crc kubenswrapper[4749]: I0310 16:04:54.796281 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jr927" Mar 10 16:04:54 crc kubenswrapper[4749]: I0310 16:04:54.997735 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jr927"] Mar 10 16:04:55 crc kubenswrapper[4749]: I0310 16:04:55.975329 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jr927" event={"ID":"5461cd34-ed04-40a8-9e4e-d230de836d00","Type":"ContainerStarted","Data":"cb3becf4194ac9312ee704fcff101ee020d3bcffae6fda680d67a3368c954705"} Mar 10 16:04:55 crc kubenswrapper[4749]: I0310 16:04:55.976061 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jr927" event={"ID":"5461cd34-ed04-40a8-9e4e-d230de836d00","Type":"ContainerStarted","Data":"ef2d0d06bce7cd329d408d3d8fbc06baf3ab6f3ed6a8133083ff27d785a24716"} Mar 10 16:04:55 crc kubenswrapper[4749]: I0310 16:04:55.992094 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jr927" podStartSLOduration=1.344275709 podStartE2EDuration="1.992068179s" podCreationTimestamp="2026-03-10 16:04:54 +0000 UTC" firstStartedPulling="2026-03-10 16:04:55.007278145 +0000 UTC m=+992.129143842" lastFinishedPulling="2026-03-10 16:04:55.655070625 +0000 UTC m=+992.776936312" observedRunningTime="2026-03-10 16:04:55.988934792 +0000 UTC m=+993.110800529" watchObservedRunningTime="2026-03-10 16:04:55.992068179 +0000 UTC m=+993.113933916" Mar 10 16:04:57 crc kubenswrapper[4749]: I0310 16:04:57.636323 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jr927"] Mar 10 16:04:57 crc kubenswrapper[4749]: I0310 16:04:57.989017 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-jr927" podUID="5461cd34-ed04-40a8-9e4e-d230de836d00" containerName="registry-server" 
containerID="cri-o://cb3becf4194ac9312ee704fcff101ee020d3bcffae6fda680d67a3368c954705" gracePeriod=2 Mar 10 16:04:58 crc kubenswrapper[4749]: I0310 16:04:58.259634 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qj8sc"] Mar 10 16:04:58 crc kubenswrapper[4749]: I0310 16:04:58.260478 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qj8sc"] Mar 10 16:04:58 crc kubenswrapper[4749]: I0310 16:04:58.260587 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qj8sc" Mar 10 16:04:58 crc kubenswrapper[4749]: I0310 16:04:58.376031 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jr927" Mar 10 16:04:58 crc kubenswrapper[4749]: I0310 16:04:58.425508 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6msx\" (UniqueName: \"kubernetes.io/projected/5461cd34-ed04-40a8-9e4e-d230de836d00-kube-api-access-p6msx\") pod \"5461cd34-ed04-40a8-9e4e-d230de836d00\" (UID: \"5461cd34-ed04-40a8-9e4e-d230de836d00\") " Mar 10 16:04:58 crc kubenswrapper[4749]: I0310 16:04:58.425661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9kjq\" (UniqueName: \"kubernetes.io/projected/a1314c71-434d-4aeb-8268-97011361d024-kube-api-access-b9kjq\") pod \"openstack-operator-index-qj8sc\" (UID: \"a1314c71-434d-4aeb-8268-97011361d024\") " pod="openstack-operators/openstack-operator-index-qj8sc" Mar 10 16:04:58 crc kubenswrapper[4749]: I0310 16:04:58.432123 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5461cd34-ed04-40a8-9e4e-d230de836d00-kube-api-access-p6msx" (OuterVolumeSpecName: "kube-api-access-p6msx") pod "5461cd34-ed04-40a8-9e4e-d230de836d00" (UID: 
"5461cd34-ed04-40a8-9e4e-d230de836d00"). InnerVolumeSpecName "kube-api-access-p6msx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:04:58 crc kubenswrapper[4749]: I0310 16:04:58.526983 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9kjq\" (UniqueName: \"kubernetes.io/projected/a1314c71-434d-4aeb-8268-97011361d024-kube-api-access-b9kjq\") pod \"openstack-operator-index-qj8sc\" (UID: \"a1314c71-434d-4aeb-8268-97011361d024\") " pod="openstack-operators/openstack-operator-index-qj8sc" Mar 10 16:04:58 crc kubenswrapper[4749]: I0310 16:04:58.527159 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6msx\" (UniqueName: \"kubernetes.io/projected/5461cd34-ed04-40a8-9e4e-d230de836d00-kube-api-access-p6msx\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:58 crc kubenswrapper[4749]: I0310 16:04:58.543164 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9kjq\" (UniqueName: \"kubernetes.io/projected/a1314c71-434d-4aeb-8268-97011361d024-kube-api-access-b9kjq\") pod \"openstack-operator-index-qj8sc\" (UID: \"a1314c71-434d-4aeb-8268-97011361d024\") " pod="openstack-operators/openstack-operator-index-qj8sc" Mar 10 16:04:58 crc kubenswrapper[4749]: I0310 16:04:58.578765 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qj8sc" Mar 10 16:04:58 crc kubenswrapper[4749]: I0310 16:04:58.781870 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qj8sc"] Mar 10 16:04:58 crc kubenswrapper[4749]: W0310 16:04:58.792319 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1314c71_434d_4aeb_8268_97011361d024.slice/crio-895d23006b194cc4e2911d634531447a2d7448b28ed0f4009aa6dfe3313466f7 WatchSource:0}: Error finding container 895d23006b194cc4e2911d634531447a2d7448b28ed0f4009aa6dfe3313466f7: Status 404 returned error can't find the container with id 895d23006b194cc4e2911d634531447a2d7448b28ed0f4009aa6dfe3313466f7 Mar 10 16:04:59 crc kubenswrapper[4749]: I0310 16:04:58.999919 4749 generic.go:334] "Generic (PLEG): container finished" podID="5461cd34-ed04-40a8-9e4e-d230de836d00" containerID="cb3becf4194ac9312ee704fcff101ee020d3bcffae6fda680d67a3368c954705" exitCode=0 Mar 10 16:04:59 crc kubenswrapper[4749]: I0310 16:04:58.999997 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jr927" Mar 10 16:04:59 crc kubenswrapper[4749]: I0310 16:04:59.000022 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jr927" event={"ID":"5461cd34-ed04-40a8-9e4e-d230de836d00","Type":"ContainerDied","Data":"cb3becf4194ac9312ee704fcff101ee020d3bcffae6fda680d67a3368c954705"} Mar 10 16:04:59 crc kubenswrapper[4749]: I0310 16:04:59.000060 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jr927" event={"ID":"5461cd34-ed04-40a8-9e4e-d230de836d00","Type":"ContainerDied","Data":"ef2d0d06bce7cd329d408d3d8fbc06baf3ab6f3ed6a8133083ff27d785a24716"} Mar 10 16:04:59 crc kubenswrapper[4749]: I0310 16:04:59.000085 4749 scope.go:117] "RemoveContainer" containerID="cb3becf4194ac9312ee704fcff101ee020d3bcffae6fda680d67a3368c954705" Mar 10 16:04:59 crc kubenswrapper[4749]: I0310 16:04:59.002077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qj8sc" event={"ID":"a1314c71-434d-4aeb-8268-97011361d024","Type":"ContainerStarted","Data":"895d23006b194cc4e2911d634531447a2d7448b28ed0f4009aa6dfe3313466f7"} Mar 10 16:04:59 crc kubenswrapper[4749]: I0310 16:04:59.023613 4749 scope.go:117] "RemoveContainer" containerID="cb3becf4194ac9312ee704fcff101ee020d3bcffae6fda680d67a3368c954705" Mar 10 16:04:59 crc kubenswrapper[4749]: E0310 16:04:59.024092 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb3becf4194ac9312ee704fcff101ee020d3bcffae6fda680d67a3368c954705\": container with ID starting with cb3becf4194ac9312ee704fcff101ee020d3bcffae6fda680d67a3368c954705 not found: ID does not exist" containerID="cb3becf4194ac9312ee704fcff101ee020d3bcffae6fda680d67a3368c954705" Mar 10 16:04:59 crc kubenswrapper[4749]: I0310 16:04:59.024148 4749 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"cb3becf4194ac9312ee704fcff101ee020d3bcffae6fda680d67a3368c954705"} err="failed to get container status \"cb3becf4194ac9312ee704fcff101ee020d3bcffae6fda680d67a3368c954705\": rpc error: code = NotFound desc = could not find container \"cb3becf4194ac9312ee704fcff101ee020d3bcffae6fda680d67a3368c954705\": container with ID starting with cb3becf4194ac9312ee704fcff101ee020d3bcffae6fda680d67a3368c954705 not found: ID does not exist" Mar 10 16:04:59 crc kubenswrapper[4749]: I0310 16:04:59.035784 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jr927"] Mar 10 16:04:59 crc kubenswrapper[4749]: I0310 16:04:59.039579 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-jr927"] Mar 10 16:04:59 crc kubenswrapper[4749]: I0310 16:04:59.615271 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5461cd34-ed04-40a8-9e4e-d230de836d00" path="/var/lib/kubelet/pods/5461cd34-ed04-40a8-9e4e-d230de836d00/volumes" Mar 10 16:05:00 crc kubenswrapper[4749]: I0310 16:05:00.011128 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qj8sc" event={"ID":"a1314c71-434d-4aeb-8268-97011361d024","Type":"ContainerStarted","Data":"04164ff91970c760191f0be0e794f069caff9dee9ff28fe3b6a4170b071aa2d6"} Mar 10 16:05:08 crc kubenswrapper[4749]: I0310 16:05:08.578947 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qj8sc" Mar 10 16:05:08 crc kubenswrapper[4749]: I0310 16:05:08.579669 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-qj8sc" Mar 10 16:05:08 crc kubenswrapper[4749]: I0310 16:05:08.613148 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qj8sc" Mar 10 
16:05:08 crc kubenswrapper[4749]: I0310 16:05:08.631078 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qj8sc" podStartSLOduration=10.026742482 podStartE2EDuration="10.631055789s" podCreationTimestamp="2026-03-10 16:04:58 +0000 UTC" firstStartedPulling="2026-03-10 16:04:58.796440558 +0000 UTC m=+995.918306245" lastFinishedPulling="2026-03-10 16:04:59.400753855 +0000 UTC m=+996.522619552" observedRunningTime="2026-03-10 16:05:00.029335074 +0000 UTC m=+997.151200781" watchObservedRunningTime="2026-03-10 16:05:08.631055789 +0000 UTC m=+1005.752921476" Mar 10 16:05:09 crc kubenswrapper[4749]: I0310 16:05:09.223251 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qj8sc" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.282878 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r"] Mar 10 16:05:10 crc kubenswrapper[4749]: E0310 16:05:10.283738 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5461cd34-ed04-40a8-9e4e-d230de836d00" containerName="registry-server" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.283759 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5461cd34-ed04-40a8-9e4e-d230de836d00" containerName="registry-server" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.283939 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5461cd34-ed04-40a8-9e4e-d230de836d00" containerName="registry-server" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.285105 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.288292 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-9sztz" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.294152 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r"] Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.307066 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpqj2\" (UniqueName: \"kubernetes.io/projected/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-kube-api-access-qpqj2\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r\" (UID: \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.307117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r\" (UID: \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.307200 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r\" (UID: \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 
16:05:10.409358 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpqj2\" (UniqueName: \"kubernetes.io/projected/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-kube-api-access-qpqj2\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r\" (UID: \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.409518 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r\" (UID: \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.409740 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r\" (UID: \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.410432 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r\" (UID: \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.410557 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r\" (UID: \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.430555 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpqj2\" (UniqueName: \"kubernetes.io/projected/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-kube-api-access-qpqj2\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r\" (UID: \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" Mar 10 16:05:10 crc kubenswrapper[4749]: I0310 16:05:10.610467 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.044435 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r"] Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.049349 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bphcx"] Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.050927 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.069680 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bphcx"] Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.117449 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a8eedaf-b39b-441a-b964-b33d943c8ae4-utilities\") pod \"redhat-marketplace-bphcx\" (UID: \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\") " pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.117658 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qckb\" (UniqueName: \"kubernetes.io/projected/3a8eedaf-b39b-441a-b964-b33d943c8ae4-kube-api-access-7qckb\") pod \"redhat-marketplace-bphcx\" (UID: \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\") " pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.117736 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a8eedaf-b39b-441a-b964-b33d943c8ae4-catalog-content\") pod \"redhat-marketplace-bphcx\" (UID: \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\") " pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.205502 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" event={"ID":"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f","Type":"ContainerStarted","Data":"fbe27ac3e37dc5fe2ebda89281f88120c32b0b8351e548a6b6b71e7dc2093b04"} Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.218683 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7qckb\" (UniqueName: \"kubernetes.io/projected/3a8eedaf-b39b-441a-b964-b33d943c8ae4-kube-api-access-7qckb\") pod \"redhat-marketplace-bphcx\" (UID: \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\") " pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.218722 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a8eedaf-b39b-441a-b964-b33d943c8ae4-catalog-content\") pod \"redhat-marketplace-bphcx\" (UID: \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\") " pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.218780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a8eedaf-b39b-441a-b964-b33d943c8ae4-utilities\") pod \"redhat-marketplace-bphcx\" (UID: \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\") " pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.219239 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a8eedaf-b39b-441a-b964-b33d943c8ae4-utilities\") pod \"redhat-marketplace-bphcx\" (UID: \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\") " pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.219331 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a8eedaf-b39b-441a-b964-b33d943c8ae4-catalog-content\") pod \"redhat-marketplace-bphcx\" (UID: \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\") " pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.244316 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qckb\" (UniqueName: 
\"kubernetes.io/projected/3a8eedaf-b39b-441a-b964-b33d943c8ae4-kube-api-access-7qckb\") pod \"redhat-marketplace-bphcx\" (UID: \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\") " pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.399209 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:11 crc kubenswrapper[4749]: I0310 16:05:11.804767 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bphcx"] Mar 10 16:05:12 crc kubenswrapper[4749]: I0310 16:05:12.216334 4749 generic.go:334] "Generic (PLEG): container finished" podID="3a8eedaf-b39b-441a-b964-b33d943c8ae4" containerID="264f60d0bb652b29e124b1eb8b7c16656f053670950180266abcc0f56a9633d3" exitCode=0 Mar 10 16:05:12 crc kubenswrapper[4749]: I0310 16:05:12.216463 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bphcx" event={"ID":"3a8eedaf-b39b-441a-b964-b33d943c8ae4","Type":"ContainerDied","Data":"264f60d0bb652b29e124b1eb8b7c16656f053670950180266abcc0f56a9633d3"} Mar 10 16:05:12 crc kubenswrapper[4749]: I0310 16:05:12.216731 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bphcx" event={"ID":"3a8eedaf-b39b-441a-b964-b33d943c8ae4","Type":"ContainerStarted","Data":"5e45e5c71ae2486138fe0278ad8828b5bde6ec5bdf0593c272e20ce10c2ffc7f"} Mar 10 16:05:12 crc kubenswrapper[4749]: I0310 16:05:12.220689 4749 generic.go:334] "Generic (PLEG): container finished" podID="a0aa2d7c-e6aa-427d-99ff-6fb5b258659f" containerID="0194746f8cddef5fbfc2d4c1d4be05b75163c8674462147c5a14b8a414f53b32" exitCode=0 Mar 10 16:05:12 crc kubenswrapper[4749]: I0310 16:05:12.220750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" 
event={"ID":"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f","Type":"ContainerDied","Data":"0194746f8cddef5fbfc2d4c1d4be05b75163c8674462147c5a14b8a414f53b32"} Mar 10 16:05:13 crc kubenswrapper[4749]: I0310 16:05:13.235788 4749 generic.go:334] "Generic (PLEG): container finished" podID="a0aa2d7c-e6aa-427d-99ff-6fb5b258659f" containerID="c8b32e84630336322451fd6552fe17d8fea6299359a7ac4111ac89a02bcf0f5a" exitCode=0 Mar 10 16:05:13 crc kubenswrapper[4749]: I0310 16:05:13.235856 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" event={"ID":"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f","Type":"ContainerDied","Data":"c8b32e84630336322451fd6552fe17d8fea6299359a7ac4111ac89a02bcf0f5a"} Mar 10 16:05:14 crc kubenswrapper[4749]: I0310 16:05:14.245217 4749 generic.go:334] "Generic (PLEG): container finished" podID="3a8eedaf-b39b-441a-b964-b33d943c8ae4" containerID="315761846d54765f28963862b4faffbe3450a457306158f64f2b68b9360ea0b5" exitCode=0 Mar 10 16:05:14 crc kubenswrapper[4749]: I0310 16:05:14.245288 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bphcx" event={"ID":"3a8eedaf-b39b-441a-b964-b33d943c8ae4","Type":"ContainerDied","Data":"315761846d54765f28963862b4faffbe3450a457306158f64f2b68b9360ea0b5"} Mar 10 16:05:14 crc kubenswrapper[4749]: I0310 16:05:14.248693 4749 generic.go:334] "Generic (PLEG): container finished" podID="a0aa2d7c-e6aa-427d-99ff-6fb5b258659f" containerID="b59bb718865c9604367e35106d23d6451883f45ac31d21b2855f4da6b7f7626d" exitCode=0 Mar 10 16:05:14 crc kubenswrapper[4749]: I0310 16:05:14.248737 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" event={"ID":"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f","Type":"ContainerDied","Data":"b59bb718865c9604367e35106d23d6451883f45ac31d21b2855f4da6b7f7626d"} Mar 10 16:05:15 crc kubenswrapper[4749]: I0310 
16:05:15.260480 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bphcx" event={"ID":"3a8eedaf-b39b-441a-b964-b33d943c8ae4","Type":"ContainerStarted","Data":"295934f9bec77aa94bffeab8460d38caad9702d46c0449446b3d34f36bcbfb47"} Mar 10 16:05:15 crc kubenswrapper[4749]: I0310 16:05:15.284884 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bphcx" podStartSLOduration=1.704553438 podStartE2EDuration="4.284867879s" podCreationTimestamp="2026-03-10 16:05:11 +0000 UTC" firstStartedPulling="2026-03-10 16:05:12.218211874 +0000 UTC m=+1009.340077561" lastFinishedPulling="2026-03-10 16:05:14.798526284 +0000 UTC m=+1011.920392002" observedRunningTime="2026-03-10 16:05:15.281068743 +0000 UTC m=+1012.402934440" watchObservedRunningTime="2026-03-10 16:05:15.284867879 +0000 UTC m=+1012.406733566" Mar 10 16:05:15 crc kubenswrapper[4749]: I0310 16:05:15.520977 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" Mar 10 16:05:15 crc kubenswrapper[4749]: I0310 16:05:15.578607 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpqj2\" (UniqueName: \"kubernetes.io/projected/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-kube-api-access-qpqj2\") pod \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\" (UID: \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\") " Mar 10 16:05:15 crc kubenswrapper[4749]: I0310 16:05:15.578741 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-bundle\") pod \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\" (UID: \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\") " Mar 10 16:05:15 crc kubenswrapper[4749]: I0310 16:05:15.578771 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-util\") pod \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\" (UID: \"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f\") " Mar 10 16:05:15 crc kubenswrapper[4749]: I0310 16:05:15.580207 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-bundle" (OuterVolumeSpecName: "bundle") pod "a0aa2d7c-e6aa-427d-99ff-6fb5b258659f" (UID: "a0aa2d7c-e6aa-427d-99ff-6fb5b258659f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:05:15 crc kubenswrapper[4749]: I0310 16:05:15.585167 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-kube-api-access-qpqj2" (OuterVolumeSpecName: "kube-api-access-qpqj2") pod "a0aa2d7c-e6aa-427d-99ff-6fb5b258659f" (UID: "a0aa2d7c-e6aa-427d-99ff-6fb5b258659f"). InnerVolumeSpecName "kube-api-access-qpqj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:05:15 crc kubenswrapper[4749]: I0310 16:05:15.596010 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-util" (OuterVolumeSpecName: "util") pod "a0aa2d7c-e6aa-427d-99ff-6fb5b258659f" (UID: "a0aa2d7c-e6aa-427d-99ff-6fb5b258659f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:05:15 crc kubenswrapper[4749]: I0310 16:05:15.680916 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpqj2\" (UniqueName: \"kubernetes.io/projected/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-kube-api-access-qpqj2\") on node \"crc\" DevicePath \"\"" Mar 10 16:05:15 crc kubenswrapper[4749]: I0310 16:05:15.680977 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:05:15 crc kubenswrapper[4749]: I0310 16:05:15.680991 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0aa2d7c-e6aa-427d-99ff-6fb5b258659f-util\") on node \"crc\" DevicePath \"\"" Mar 10 16:05:15 crc kubenswrapper[4749]: E0310 16:05:15.769774 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0aa2d7c_e6aa_427d_99ff_6fb5b258659f.slice/crio-fbe27ac3e37dc5fe2ebda89281f88120c32b0b8351e548a6b6b71e7dc2093b04\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0aa2d7c_e6aa_427d_99ff_6fb5b258659f.slice\": RecentStats: unable to find data in memory cache]" Mar 10 16:05:16 crc kubenswrapper[4749]: I0310 16:05:16.269409 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" event={"ID":"a0aa2d7c-e6aa-427d-99ff-6fb5b258659f","Type":"ContainerDied","Data":"fbe27ac3e37dc5fe2ebda89281f88120c32b0b8351e548a6b6b71e7dc2093b04"} Mar 10 16:05:16 crc kubenswrapper[4749]: I0310 16:05:16.269494 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbe27ac3e37dc5fe2ebda89281f88120c32b0b8351e548a6b6b71e7dc2093b04" Mar 10 16:05:16 crc kubenswrapper[4749]: I0310 16:05:16.269446 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r" Mar 10 16:05:20 crc kubenswrapper[4749]: I0310 16:05:20.409408 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-6rcf4"] Mar 10 16:05:20 crc kubenswrapper[4749]: E0310 16:05:20.409869 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0aa2d7c-e6aa-427d-99ff-6fb5b258659f" containerName="extract" Mar 10 16:05:20 crc kubenswrapper[4749]: I0310 16:05:20.409881 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0aa2d7c-e6aa-427d-99ff-6fb5b258659f" containerName="extract" Mar 10 16:05:20 crc kubenswrapper[4749]: E0310 16:05:20.409898 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0aa2d7c-e6aa-427d-99ff-6fb5b258659f" containerName="util" Mar 10 16:05:20 crc kubenswrapper[4749]: I0310 16:05:20.409904 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0aa2d7c-e6aa-427d-99ff-6fb5b258659f" containerName="util" Mar 10 16:05:20 crc kubenswrapper[4749]: E0310 16:05:20.409923 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0aa2d7c-e6aa-427d-99ff-6fb5b258659f" containerName="pull" Mar 10 16:05:20 crc kubenswrapper[4749]: I0310 16:05:20.409929 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0aa2d7c-e6aa-427d-99ff-6fb5b258659f" 
containerName="pull" Mar 10 16:05:20 crc kubenswrapper[4749]: I0310 16:05:20.410022 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0aa2d7c-e6aa-427d-99ff-6fb5b258659f" containerName="extract" Mar 10 16:05:20 crc kubenswrapper[4749]: I0310 16:05:20.410396 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-6rcf4" Mar 10 16:05:20 crc kubenswrapper[4749]: I0310 16:05:20.412487 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-4pqpm" Mar 10 16:05:20 crc kubenswrapper[4749]: I0310 16:05:20.432652 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-6rcf4"] Mar 10 16:05:20 crc kubenswrapper[4749]: I0310 16:05:20.451954 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hxmx\" (UniqueName: \"kubernetes.io/projected/97f36cd6-28eb-4a46-928c-0a1ea78da590-kube-api-access-7hxmx\") pod \"openstack-operator-controller-init-6cf8df7788-6rcf4\" (UID: \"97f36cd6-28eb-4a46-928c-0a1ea78da590\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-6rcf4" Mar 10 16:05:20 crc kubenswrapper[4749]: I0310 16:05:20.554164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hxmx\" (UniqueName: \"kubernetes.io/projected/97f36cd6-28eb-4a46-928c-0a1ea78da590-kube-api-access-7hxmx\") pod \"openstack-operator-controller-init-6cf8df7788-6rcf4\" (UID: \"97f36cd6-28eb-4a46-928c-0a1ea78da590\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-6rcf4" Mar 10 16:05:20 crc kubenswrapper[4749]: I0310 16:05:20.578099 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hxmx\" (UniqueName: 
\"kubernetes.io/projected/97f36cd6-28eb-4a46-928c-0a1ea78da590-kube-api-access-7hxmx\") pod \"openstack-operator-controller-init-6cf8df7788-6rcf4\" (UID: \"97f36cd6-28eb-4a46-928c-0a1ea78da590\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-6rcf4" Mar 10 16:05:20 crc kubenswrapper[4749]: I0310 16:05:20.725451 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-6rcf4" Mar 10 16:05:21 crc kubenswrapper[4749]: I0310 16:05:21.217876 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-6rcf4"] Mar 10 16:05:21 crc kubenswrapper[4749]: I0310 16:05:21.309868 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-6rcf4" event={"ID":"97f36cd6-28eb-4a46-928c-0a1ea78da590","Type":"ContainerStarted","Data":"627bce3706c66059e6b84102e45c3173adfae9c0cc6b2b849bbbfc2ba85f1cba"} Mar 10 16:05:21 crc kubenswrapper[4749]: I0310 16:05:21.400196 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:21 crc kubenswrapper[4749]: I0310 16:05:21.400242 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:21 crc kubenswrapper[4749]: I0310 16:05:21.443997 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:22 crc kubenswrapper[4749]: I0310 16:05:22.356978 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:23 crc kubenswrapper[4749]: I0310 16:05:23.235166 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bphcx"] Mar 10 16:05:24 crc 
kubenswrapper[4749]: I0310 16:05:24.342803 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bphcx" podUID="3a8eedaf-b39b-441a-b964-b33d943c8ae4" containerName="registry-server" containerID="cri-o://295934f9bec77aa94bffeab8460d38caad9702d46c0449446b3d34f36bcbfb47" gracePeriod=2 Mar 10 16:05:25 crc kubenswrapper[4749]: I0310 16:05:25.354086 4749 generic.go:334] "Generic (PLEG): container finished" podID="3a8eedaf-b39b-441a-b964-b33d943c8ae4" containerID="295934f9bec77aa94bffeab8460d38caad9702d46c0449446b3d34f36bcbfb47" exitCode=0 Mar 10 16:05:25 crc kubenswrapper[4749]: I0310 16:05:25.354187 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bphcx" event={"ID":"3a8eedaf-b39b-441a-b964-b33d943c8ae4","Type":"ContainerDied","Data":"295934f9bec77aa94bffeab8460d38caad9702d46c0449446b3d34f36bcbfb47"} Mar 10 16:05:25 crc kubenswrapper[4749]: I0310 16:05:25.702365 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:25 crc kubenswrapper[4749]: I0310 16:05:25.830528 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a8eedaf-b39b-441a-b964-b33d943c8ae4-catalog-content\") pod \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\" (UID: \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\") " Mar 10 16:05:25 crc kubenswrapper[4749]: I0310 16:05:25.830592 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qckb\" (UniqueName: \"kubernetes.io/projected/3a8eedaf-b39b-441a-b964-b33d943c8ae4-kube-api-access-7qckb\") pod \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\" (UID: \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\") " Mar 10 16:05:25 crc kubenswrapper[4749]: I0310 16:05:25.830628 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a8eedaf-b39b-441a-b964-b33d943c8ae4-utilities\") pod \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\" (UID: \"3a8eedaf-b39b-441a-b964-b33d943c8ae4\") " Mar 10 16:05:25 crc kubenswrapper[4749]: I0310 16:05:25.831788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a8eedaf-b39b-441a-b964-b33d943c8ae4-utilities" (OuterVolumeSpecName: "utilities") pod "3a8eedaf-b39b-441a-b964-b33d943c8ae4" (UID: "3a8eedaf-b39b-441a-b964-b33d943c8ae4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:05:25 crc kubenswrapper[4749]: I0310 16:05:25.841454 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a8eedaf-b39b-441a-b964-b33d943c8ae4-kube-api-access-7qckb" (OuterVolumeSpecName: "kube-api-access-7qckb") pod "3a8eedaf-b39b-441a-b964-b33d943c8ae4" (UID: "3a8eedaf-b39b-441a-b964-b33d943c8ae4"). InnerVolumeSpecName "kube-api-access-7qckb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:05:25 crc kubenswrapper[4749]: I0310 16:05:25.860799 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a8eedaf-b39b-441a-b964-b33d943c8ae4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a8eedaf-b39b-441a-b964-b33d943c8ae4" (UID: "3a8eedaf-b39b-441a-b964-b33d943c8ae4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:05:25 crc kubenswrapper[4749]: I0310 16:05:25.931911 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a8eedaf-b39b-441a-b964-b33d943c8ae4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:05:25 crc kubenswrapper[4749]: I0310 16:05:25.931945 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qckb\" (UniqueName: \"kubernetes.io/projected/3a8eedaf-b39b-441a-b964-b33d943c8ae4-kube-api-access-7qckb\") on node \"crc\" DevicePath \"\"" Mar 10 16:05:25 crc kubenswrapper[4749]: I0310 16:05:25.931957 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a8eedaf-b39b-441a-b964-b33d943c8ae4-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:05:26 crc kubenswrapper[4749]: I0310 16:05:26.360993 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-6rcf4" event={"ID":"97f36cd6-28eb-4a46-928c-0a1ea78da590","Type":"ContainerStarted","Data":"f2252338b43ae457b3ce7adbc8e17639dd8aaa1773dd5f2bd9b4e95e16450b60"} Mar 10 16:05:26 crc kubenswrapper[4749]: I0310 16:05:26.361116 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-6rcf4" Mar 10 16:05:26 crc kubenswrapper[4749]: I0310 16:05:26.363271 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-bphcx" event={"ID":"3a8eedaf-b39b-441a-b964-b33d943c8ae4","Type":"ContainerDied","Data":"5e45e5c71ae2486138fe0278ad8828b5bde6ec5bdf0593c272e20ce10c2ffc7f"} Mar 10 16:05:26 crc kubenswrapper[4749]: I0310 16:05:26.363778 4749 scope.go:117] "RemoveContainer" containerID="295934f9bec77aa94bffeab8460d38caad9702d46c0449446b3d34f36bcbfb47" Mar 10 16:05:26 crc kubenswrapper[4749]: I0310 16:05:26.363338 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bphcx" Mar 10 16:05:26 crc kubenswrapper[4749]: I0310 16:05:26.398967 4749 scope.go:117] "RemoveContainer" containerID="315761846d54765f28963862b4faffbe3450a457306158f64f2b68b9360ea0b5" Mar 10 16:05:26 crc kubenswrapper[4749]: I0310 16:05:26.400511 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-6rcf4" podStartSLOduration=2.103230119 podStartE2EDuration="6.400040174s" podCreationTimestamp="2026-03-10 16:05:20 +0000 UTC" firstStartedPulling="2026-03-10 16:05:21.230966349 +0000 UTC m=+1018.352832036" lastFinishedPulling="2026-03-10 16:05:25.527776414 +0000 UTC m=+1022.649642091" observedRunningTime="2026-03-10 16:05:26.396743873 +0000 UTC m=+1023.518609570" watchObservedRunningTime="2026-03-10 16:05:26.400040174 +0000 UTC m=+1023.521905861" Mar 10 16:05:26 crc kubenswrapper[4749]: I0310 16:05:26.417114 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bphcx"] Mar 10 16:05:26 crc kubenswrapper[4749]: I0310 16:05:26.424674 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bphcx"] Mar 10 16:05:26 crc kubenswrapper[4749]: I0310 16:05:26.427361 4749 scope.go:117] "RemoveContainer" containerID="264f60d0bb652b29e124b1eb8b7c16656f053670950180266abcc0f56a9633d3" Mar 10 16:05:27 crc kubenswrapper[4749]: I0310 
16:05:27.622008 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a8eedaf-b39b-441a-b964-b33d943c8ae4" path="/var/lib/kubelet/pods/3a8eedaf-b39b-441a-b964-b33d943c8ae4/volumes" Mar 10 16:05:30 crc kubenswrapper[4749]: I0310 16:05:30.728936 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-6rcf4" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.086659 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-cl6tb"] Mar 10 16:05:50 crc kubenswrapper[4749]: E0310 16:05:50.087765 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a8eedaf-b39b-441a-b964-b33d943c8ae4" containerName="extract-content" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.087781 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a8eedaf-b39b-441a-b964-b33d943c8ae4" containerName="extract-content" Mar 10 16:05:50 crc kubenswrapper[4749]: E0310 16:05:50.087794 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a8eedaf-b39b-441a-b964-b33d943c8ae4" containerName="registry-server" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.087800 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a8eedaf-b39b-441a-b964-b33d943c8ae4" containerName="registry-server" Mar 10 16:05:50 crc kubenswrapper[4749]: E0310 16:05:50.087811 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a8eedaf-b39b-441a-b964-b33d943c8ae4" containerName="extract-utilities" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.087817 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a8eedaf-b39b-441a-b964-b33d943c8ae4" containerName="extract-utilities" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.087950 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a8eedaf-b39b-441a-b964-b33d943c8ae4" 
containerName="registry-server" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.088434 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cl6tb" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.090355 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5qp4m" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.092502 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-w9j99"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.093436 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-w9j99" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.096628 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-8mfvl" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.099344 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-cl6tb"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.105041 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-w9j99"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.111486 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-cmqsz"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.112295 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cmqsz" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.114343 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-8s2vv" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.134254 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-cmqsz"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.153109 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-rk5qv"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.154096 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rk5qv" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.165216 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-6w8j5" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.177674 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-8wbhh"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.180734 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8wbhh" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.182441 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-rk5qv"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.182729 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-z29gn" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.191936 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-8wbhh"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.196927 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fcz88"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.198082 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fcz88" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.203463 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fcz88"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.204976 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-x6gjc" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.220182 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.221225 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.224976 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-bdgzm" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.225483 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.227544 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.237802 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-nt44l"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.239554 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-nt44l" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.241983 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-r9r9p" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.243002 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-94s5n"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.243732 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-94s5n" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.245916 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qvxd5" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.256527 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-5spgx"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.257365 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-5spgx" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.259411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcqg6\" (UniqueName: \"kubernetes.io/projected/c7551811-07e9-4b2d-8367-8468bf446068-kube-api-access-vcqg6\") pod \"cinder-operator-controller-manager-984cd4dcf-w9j99\" (UID: \"c7551811-07e9-4b2d-8367-8468bf446068\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-w9j99" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.259469 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92rln\" (UniqueName: \"kubernetes.io/projected/9aa25d5b-083e-4b81-ab1e-018e4305b8be-kube-api-access-92rln\") pod \"glance-operator-controller-manager-5964f64c48-rk5qv\" (UID: \"9aa25d5b-083e-4b81-ab1e-018e4305b8be\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rk5qv" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.259553 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzdgc\" (UniqueName: \"kubernetes.io/projected/c0b12ff9-ef73-4f00-b0ed-655a5113714e-kube-api-access-zzdgc\") pod 
\"designate-operator-controller-manager-66d56f6ff4-cmqsz\" (UID: \"c0b12ff9-ef73-4f00-b0ed-655a5113714e\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cmqsz" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.259589 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpszw\" (UniqueName: \"kubernetes.io/projected/67fcadbc-6b7f-47b2-a723-544783895834-kube-api-access-fpszw\") pod \"barbican-operator-controller-manager-677bd678f7-cl6tb\" (UID: \"67fcadbc-6b7f-47b2-a723-544783895834\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cl6tb" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.267332 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-lbtmt" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.271349 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-nt44l"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.304997 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-94s5n"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.334025 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdtpp"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.336156 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdtpp" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.340832 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-wtp6j" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.361680 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ltxz\" (UniqueName: \"kubernetes.io/projected/ce6cef40-3b60-442d-86b0-ad5b583183a4-kube-api-access-6ltxz\") pod \"manila-operator-controller-manager-68f45f9d9f-5spgx\" (UID: \"ce6cef40-3b60-442d-86b0-ad5b583183a4\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-5spgx" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.361745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wts7m\" (UniqueName: \"kubernetes.io/projected/33bd7186-cfb3-49b4-aaf1-a8015fe78fbd-kube-api-access-wts7m\") pod \"ironic-operator-controller-manager-6bbb499bbc-nt44l\" (UID: \"33bd7186-cfb3-49b4-aaf1-a8015fe78fbd\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-nt44l" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.361780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcqg6\" (UniqueName: \"kubernetes.io/projected/c7551811-07e9-4b2d-8367-8468bf446068-kube-api-access-vcqg6\") pod \"cinder-operator-controller-manager-984cd4dcf-w9j99\" (UID: \"c7551811-07e9-4b2d-8367-8468bf446068\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-w9j99" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.361807 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92rln\" (UniqueName: \"kubernetes.io/projected/9aa25d5b-083e-4b81-ab1e-018e4305b8be-kube-api-access-92rln\") pod 
\"glance-operator-controller-manager-5964f64c48-rk5qv\" (UID: \"9aa25d5b-083e-4b81-ab1e-018e4305b8be\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rk5qv" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.361835 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njf7n\" (UniqueName: \"kubernetes.io/projected/4ad2d548-d5ed-4933-9a6d-1cb903434d41-kube-api-access-njf7n\") pod \"horizon-operator-controller-manager-6d9d6b584d-fcz88\" (UID: \"4ad2d548-d5ed-4933-9a6d-1cb903434d41\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fcz88" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.361881 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert\") pod \"infra-operator-controller-manager-5995f4446f-5r86d\" (UID: \"b4cb9d6b-00f0-478e-a275-2720e6f90e8a\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.361907 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpv4p\" (UniqueName: \"kubernetes.io/projected/944a1147-1517-4491-b7ee-1d0479e25c4c-kube-api-access-gpv4p\") pod \"heat-operator-controller-manager-77b6666d85-8wbhh\" (UID: \"944a1147-1517-4491-b7ee-1d0479e25c4c\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8wbhh" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.361938 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndn6r\" (UniqueName: \"kubernetes.io/projected/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-kube-api-access-ndn6r\") pod \"infra-operator-controller-manager-5995f4446f-5r86d\" (UID: \"b4cb9d6b-00f0-478e-a275-2720e6f90e8a\") " 
pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.361967 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzdgc\" (UniqueName: \"kubernetes.io/projected/c0b12ff9-ef73-4f00-b0ed-655a5113714e-kube-api-access-zzdgc\") pod \"designate-operator-controller-manager-66d56f6ff4-cmqsz\" (UID: \"c0b12ff9-ef73-4f00-b0ed-655a5113714e\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cmqsz" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.362000 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8wfj\" (UniqueName: \"kubernetes.io/projected/b0053e43-866e-4c68-b4fe-edc5b10110f2-kube-api-access-p8wfj\") pod \"keystone-operator-controller-manager-684f77d66d-94s5n\" (UID: \"b0053e43-866e-4c68-b4fe-edc5b10110f2\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-94s5n" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.362026 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpszw\" (UniqueName: \"kubernetes.io/projected/67fcadbc-6b7f-47b2-a723-544783895834-kube-api-access-fpszw\") pod \"barbican-operator-controller-manager-677bd678f7-cl6tb\" (UID: \"67fcadbc-6b7f-47b2-a723-544783895834\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cl6tb" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.362698 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-5spgx"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.386068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpszw\" (UniqueName: \"kubernetes.io/projected/67fcadbc-6b7f-47b2-a723-544783895834-kube-api-access-fpszw\") pod 
\"barbican-operator-controller-manager-677bd678f7-cl6tb\" (UID: \"67fcadbc-6b7f-47b2-a723-544783895834\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cl6tb" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.393133 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92rln\" (UniqueName: \"kubernetes.io/projected/9aa25d5b-083e-4b81-ab1e-018e4305b8be-kube-api-access-92rln\") pod \"glance-operator-controller-manager-5964f64c48-rk5qv\" (UID: \"9aa25d5b-083e-4b81-ab1e-018e4305b8be\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rk5qv" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.394389 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzdgc\" (UniqueName: \"kubernetes.io/projected/c0b12ff9-ef73-4f00-b0ed-655a5113714e-kube-api-access-zzdgc\") pod \"designate-operator-controller-manager-66d56f6ff4-cmqsz\" (UID: \"c0b12ff9-ef73-4f00-b0ed-655a5113714e\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cmqsz" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.394442 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-986pw"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.395469 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-986pw" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.399539 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vpg5h" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.400487 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcqg6\" (UniqueName: \"kubernetes.io/projected/c7551811-07e9-4b2d-8367-8468bf446068-kube-api-access-vcqg6\") pod \"cinder-operator-controller-manager-984cd4dcf-w9j99\" (UID: \"c7551811-07e9-4b2d-8367-8468bf446068\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-w9j99" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.413884 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cl6tb" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.422747 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdtpp"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.427890 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-w9j99" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.435596 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cmqsz" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.436811 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-986pw"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.457524 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-kgb5j"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.458493 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-kgb5j" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.467397 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gcgcm"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.467724 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8wfj\" (UniqueName: \"kubernetes.io/projected/b0053e43-866e-4c68-b4fe-edc5b10110f2-kube-api-access-p8wfj\") pod \"keystone-operator-controller-manager-684f77d66d-94s5n\" (UID: \"b0053e43-866e-4c68-b4fe-edc5b10110f2\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-94s5n" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.467767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ltxz\" (UniqueName: \"kubernetes.io/projected/ce6cef40-3b60-442d-86b0-ad5b583183a4-kube-api-access-6ltxz\") pod \"manila-operator-controller-manager-68f45f9d9f-5spgx\" (UID: \"ce6cef40-3b60-442d-86b0-ad5b583183a4\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-5spgx" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.467822 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wts7m\" 
(UniqueName: \"kubernetes.io/projected/33bd7186-cfb3-49b4-aaf1-a8015fe78fbd-kube-api-access-wts7m\") pod \"ironic-operator-controller-manager-6bbb499bbc-nt44l\" (UID: \"33bd7186-cfb3-49b4-aaf1-a8015fe78fbd\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-nt44l" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.467864 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njf7n\" (UniqueName: \"kubernetes.io/projected/4ad2d548-d5ed-4933-9a6d-1cb903434d41-kube-api-access-njf7n\") pod \"horizon-operator-controller-manager-6d9d6b584d-fcz88\" (UID: \"4ad2d548-d5ed-4933-9a6d-1cb903434d41\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fcz88" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.467894 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99k6s\" (UniqueName: \"kubernetes.io/projected/6984ff81-5091-4cb9-b665-9dcd5544e193-kube-api-access-99k6s\") pod \"mariadb-operator-controller-manager-658d4cdd5-rdtpp\" (UID: \"6984ff81-5091-4cb9-b665-9dcd5544e193\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdtpp" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.467935 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert\") pod \"infra-operator-controller-manager-5995f4446f-5r86d\" (UID: \"b4cb9d6b-00f0-478e-a275-2720e6f90e8a\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.467956 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpv4p\" (UniqueName: \"kubernetes.io/projected/944a1147-1517-4491-b7ee-1d0479e25c4c-kube-api-access-gpv4p\") pod \"heat-operator-controller-manager-77b6666d85-8wbhh\" (UID: 
\"944a1147-1517-4491-b7ee-1d0479e25c4c\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8wbhh" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.467979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndn6r\" (UniqueName: \"kubernetes.io/projected/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-kube-api-access-ndn6r\") pod \"infra-operator-controller-manager-5995f4446f-5r86d\" (UID: \"b4cb9d6b-00f0-478e-a275-2720e6f90e8a\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.468526 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gcgcm" Mar 10 16:05:50 crc kubenswrapper[4749]: E0310 16:05:50.468538 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 16:05:50 crc kubenswrapper[4749]: E0310 16:05:50.468752 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert podName:b4cb9d6b-00f0-478e-a275-2720e6f90e8a nodeName:}" failed. No retries permitted until 2026-03-10 16:05:50.968736352 +0000 UTC m=+1048.090602039 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert") pod "infra-operator-controller-manager-5995f4446f-5r86d" (UID: "b4cb9d6b-00f0-478e-a275-2720e6f90e8a") : secret "infra-operator-webhook-server-cert" not found Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.471733 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-kgb5j"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.474707 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-jn7z7" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.479036 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-p7wq5" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.479530 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rk5qv" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.487422 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njf7n\" (UniqueName: \"kubernetes.io/projected/4ad2d548-d5ed-4933-9a6d-1cb903434d41-kube-api-access-njf7n\") pod \"horizon-operator-controller-manager-6d9d6b584d-fcz88\" (UID: \"4ad2d548-d5ed-4933-9a6d-1cb903434d41\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fcz88" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.488045 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndn6r\" (UniqueName: \"kubernetes.io/projected/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-kube-api-access-ndn6r\") pod \"infra-operator-controller-manager-5995f4446f-5r86d\" (UID: \"b4cb9d6b-00f0-478e-a275-2720e6f90e8a\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.509865 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8wfj\" (UniqueName: \"kubernetes.io/projected/b0053e43-866e-4c68-b4fe-edc5b10110f2-kube-api-access-p8wfj\") pod \"keystone-operator-controller-manager-684f77d66d-94s5n\" (UID: \"b0053e43-866e-4c68-b4fe-edc5b10110f2\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-94s5n" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.513872 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ltxz\" (UniqueName: \"kubernetes.io/projected/ce6cef40-3b60-442d-86b0-ad5b583183a4-kube-api-access-6ltxz\") pod \"manila-operator-controller-manager-68f45f9d9f-5spgx\" (UID: \"ce6cef40-3b60-442d-86b0-ad5b583183a4\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-5spgx" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.514431 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpv4p\" (UniqueName: \"kubernetes.io/projected/944a1147-1517-4491-b7ee-1d0479e25c4c-kube-api-access-gpv4p\") pod \"heat-operator-controller-manager-77b6666d85-8wbhh\" (UID: \"944a1147-1517-4491-b7ee-1d0479e25c4c\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8wbhh" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.515566 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gcgcm"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.516082 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wts7m\" (UniqueName: \"kubernetes.io/projected/33bd7186-cfb3-49b4-aaf1-a8015fe78fbd-kube-api-access-wts7m\") pod \"ironic-operator-controller-manager-6bbb499bbc-nt44l\" (UID: \"33bd7186-cfb3-49b4-aaf1-a8015fe78fbd\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-nt44l" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.524966 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fcz88" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.541408 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.544081 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.549604 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.551597 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.551698 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.563281 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.563433 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.563500 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-c296m" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.564918 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-rlmzj" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.570563 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-v6kk4"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.570995 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-nt44l" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.571164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99k6s\" (UniqueName: \"kubernetes.io/projected/6984ff81-5091-4cb9-b665-9dcd5544e193-kube-api-access-99k6s\") pod \"mariadb-operator-controller-manager-658d4cdd5-rdtpp\" (UID: \"6984ff81-5091-4cb9-b665-9dcd5544e193\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdtpp" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.571201 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jl2c\" (UniqueName: \"kubernetes.io/projected/86e55c7e-3719-4d43-9803-ec8185965320-kube-api-access-9jl2c\") pod \"nova-operator-controller-manager-569cc54c5-kgb5j\" (UID: \"86e55c7e-3719-4d43-9803-ec8185965320\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-kgb5j" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.571495 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpsfg\" (UniqueName: \"kubernetes.io/projected/3747cd39-1cb6-439f-8548-41e8f2a609f4-kube-api-access-rpsfg\") pod \"octavia-operator-controller-manager-5f4f55cb5c-gcgcm\" (UID: \"3747cd39-1cb6-439f-8548-41e8f2a609f4\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gcgcm" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.571532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctpqk\" (UniqueName: \"kubernetes.io/projected/6b9320db-2215-4964-bbd5-7437a092fe31-kube-api-access-ctpqk\") pod \"neutron-operator-controller-manager-776c5696bf-986pw\" (UID: \"6b9320db-2215-4964-bbd5-7437a092fe31\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-986pw" Mar 
10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.571851 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-v6kk4" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.578185 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-df9jw" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.585370 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.586428 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.589922 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-hqlk4" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.608594 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.617518 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99k6s\" (UniqueName: \"kubernetes.io/projected/6984ff81-5091-4cb9-b665-9dcd5544e193-kube-api-access-99k6s\") pod \"mariadb-operator-controller-manager-658d4cdd5-rdtpp\" (UID: \"6984ff81-5091-4cb9-b665-9dcd5544e193\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdtpp" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.618393 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-94s5n" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.632809 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-5spgx" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.660620 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdtpp" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.660797 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-v6kk4"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.677237 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.678049 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.678318 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t55jt\" (UniqueName: \"kubernetes.io/projected/40b9eefc-cf39-40d7-8f08-415714ea31d9-kube-api-access-t55jt\") pod \"ovn-operator-controller-manager-bbc5b68f9-7kfw8\" (UID: \"40b9eefc-cf39-40d7-8f08-415714ea31d9\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.678361 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885ft9ddh\" (UID: \"5a87391b-1b62-4214-ae0d-07c29e9e5efa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.678412 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctpqk\" (UniqueName: \"kubernetes.io/projected/6b9320db-2215-4964-bbd5-7437a092fe31-kube-api-access-ctpqk\") pod \"neutron-operator-controller-manager-776c5696bf-986pw\" (UID: \"6b9320db-2215-4964-bbd5-7437a092fe31\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-986pw" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.678479 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkzw6\" (UniqueName: \"kubernetes.io/projected/5a87391b-1b62-4214-ae0d-07c29e9e5efa-kube-api-access-jkzw6\") pod \"openstack-baremetal-operator-controller-manager-6647d7885ft9ddh\" (UID: \"5a87391b-1b62-4214-ae0d-07c29e9e5efa\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.678509 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jl2c\" (UniqueName: \"kubernetes.io/projected/86e55c7e-3719-4d43-9803-ec8185965320-kube-api-access-9jl2c\") pod \"nova-operator-controller-manager-569cc54c5-kgb5j\" (UID: \"86e55c7e-3719-4d43-9803-ec8185965320\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-kgb5j" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.678595 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fllgl\" (UniqueName: \"kubernetes.io/projected/fe1af8b4-2a44-478b-9936-4e3fe4d90612-kube-api-access-fllgl\") pod \"swift-operator-controller-manager-677c674df7-v6kk4\" (UID: \"fe1af8b4-2a44-478b-9936-4e3fe4d90612\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-v6kk4" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.678648 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpsfg\" (UniqueName: \"kubernetes.io/projected/3747cd39-1cb6-439f-8548-41e8f2a609f4-kube-api-access-rpsfg\") pod \"octavia-operator-controller-manager-5f4f55cb5c-gcgcm\" (UID: \"3747cd39-1cb6-439f-8548-41e8f2a609f4\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gcgcm" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.678676 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjv7k\" (UniqueName: \"kubernetes.io/projected/369202bf-81ec-4cf9-8540-c6a05a2447aa-kube-api-access-bjv7k\") pod \"placement-operator-controller-manager-574d45c66c-mwm5j\" (UID: \"369202bf-81ec-4cf9-8540-c6a05a2447aa\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j" Mar 10 16:05:50 crc 
kubenswrapper[4749]: I0310 16:05:50.691431 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-g67fp" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.692252 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.706942 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.707720 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.710688 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-pcg4t" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.727888 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctpqk\" (UniqueName: \"kubernetes.io/projected/6b9320db-2215-4964-bbd5-7437a092fe31-kube-api-access-ctpqk\") pod \"neutron-operator-controller-manager-776c5696bf-986pw\" (UID: \"6b9320db-2215-4964-bbd5-7437a092fe31\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-986pw" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.729758 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.730481 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpsfg\" (UniqueName: \"kubernetes.io/projected/3747cd39-1cb6-439f-8548-41e8f2a609f4-kube-api-access-rpsfg\") pod \"octavia-operator-controller-manager-5f4f55cb5c-gcgcm\" (UID: 
\"3747cd39-1cb6-439f-8548-41e8f2a609f4\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gcgcm" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.746674 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jl2c\" (UniqueName: \"kubernetes.io/projected/86e55c7e-3719-4d43-9803-ec8185965320-kube-api-access-9jl2c\") pod \"nova-operator-controller-manager-569cc54c5-kgb5j\" (UID: \"86e55c7e-3719-4d43-9803-ec8185965320\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-kgb5j" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.782918 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkzw6\" (UniqueName: \"kubernetes.io/projected/5a87391b-1b62-4214-ae0d-07c29e9e5efa-kube-api-access-jkzw6\") pod \"openstack-baremetal-operator-controller-manager-6647d7885ft9ddh\" (UID: \"5a87391b-1b62-4214-ae0d-07c29e9e5efa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.783038 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74kgb\" (UniqueName: \"kubernetes.io/projected/57b4a19f-1a4b-4db9-8e25-fb3ed92e1388-kube-api-access-74kgb\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-tzm5k\" (UID: \"57b4a19f-1a4b-4db9-8e25-fb3ed92e1388\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.783122 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fllgl\" (UniqueName: \"kubernetes.io/projected/fe1af8b4-2a44-478b-9936-4e3fe4d90612-kube-api-access-fllgl\") pod \"swift-operator-controller-manager-677c674df7-v6kk4\" (UID: \"fe1af8b4-2a44-478b-9936-4e3fe4d90612\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-v6kk4" Mar 10 
16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.783199 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjv7k\" (UniqueName: \"kubernetes.io/projected/369202bf-81ec-4cf9-8540-c6a05a2447aa-kube-api-access-bjv7k\") pod \"placement-operator-controller-manager-574d45c66c-mwm5j\" (UID: \"369202bf-81ec-4cf9-8540-c6a05a2447aa\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.783241 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t55jt\" (UniqueName: \"kubernetes.io/projected/40b9eefc-cf39-40d7-8f08-415714ea31d9-kube-api-access-t55jt\") pod \"ovn-operator-controller-manager-bbc5b68f9-7kfw8\" (UID: \"40b9eefc-cf39-40d7-8f08-415714ea31d9\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.783277 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885ft9ddh\" (UID: \"5a87391b-1b62-4214-ae0d-07c29e9e5efa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:05:50 crc kubenswrapper[4749]: E0310 16:05:50.783463 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 16:05:50 crc kubenswrapper[4749]: E0310 16:05:50.783531 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert podName:5a87391b-1b62-4214-ae0d-07c29e9e5efa nodeName:}" failed. No retries permitted until 2026-03-10 16:05:51.283510409 +0000 UTC m=+1048.405376096 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" (UID: "5a87391b-1b62-4214-ae0d-07c29e9e5efa") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.797548 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8wbhh" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.820486 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fqvkq"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.821626 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fqvkq" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.825686 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-7547z" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.836865 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjv7k\" (UniqueName: \"kubernetes.io/projected/369202bf-81ec-4cf9-8540-c6a05a2447aa-kube-api-access-bjv7k\") pod \"placement-operator-controller-manager-574d45c66c-mwm5j\" (UID: \"369202bf-81ec-4cf9-8540-c6a05a2447aa\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.839498 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fqvkq"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.839576 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fllgl\" (UniqueName: 
\"kubernetes.io/projected/fe1af8b4-2a44-478b-9936-4e3fe4d90612-kube-api-access-fllgl\") pod \"swift-operator-controller-manager-677c674df7-v6kk4\" (UID: \"fe1af8b4-2a44-478b-9936-4e3fe4d90612\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-v6kk4" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.841295 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-986pw" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.845228 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t55jt\" (UniqueName: \"kubernetes.io/projected/40b9eefc-cf39-40d7-8f08-415714ea31d9-kube-api-access-t55jt\") pod \"ovn-operator-controller-manager-bbc5b68f9-7kfw8\" (UID: \"40b9eefc-cf39-40d7-8f08-415714ea31d9\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.850646 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkzw6\" (UniqueName: \"kubernetes.io/projected/5a87391b-1b62-4214-ae0d-07c29e9e5efa-kube-api-access-jkzw6\") pod \"openstack-baremetal-operator-controller-manager-6647d7885ft9ddh\" (UID: \"5a87391b-1b62-4214-ae0d-07c29e9e5efa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.861323 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.863985 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-kgb5j" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.870700 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.872252 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.874490 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gcgcm" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.876109 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.876668 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.877238 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-mr4fj" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.880311 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.884729 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74kgb\" (UniqueName: \"kubernetes.io/projected/57b4a19f-1a4b-4db9-8e25-fb3ed92e1388-kube-api-access-74kgb\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-tzm5k\" (UID: \"57b4a19f-1a4b-4db9-8e25-fb3ed92e1388\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k" Mar 10 16:05:50 crc kubenswrapper[4749]: 
I0310 16:05:50.884803 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb6f7\" (UniqueName: \"kubernetes.io/projected/680e1c04-e829-45a0-a323-4d40ec62b076-kube-api-access-tb6f7\") pod \"test-operator-controller-manager-5c5cb9c4d7-xrkmh\" (UID: \"680e1c04-e829-45a0-a323-4d40ec62b076\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.924664 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74kgb\" (UniqueName: \"kubernetes.io/projected/57b4a19f-1a4b-4db9-8e25-fb3ed92e1388-kube-api-access-74kgb\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-tzm5k\" (UID: \"57b4a19f-1a4b-4db9-8e25-fb3ed92e1388\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.935507 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.940452 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l8vqg"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.941391 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l8vqg" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.944536 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-bk7mj" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.949964 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l8vqg"] Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.986258 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.986341 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert\") pod \"infra-operator-controller-manager-5995f4446f-5r86d\" (UID: \"b4cb9d6b-00f0-478e-a275-2720e6f90e8a\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.986416 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb6f7\" (UniqueName: \"kubernetes.io/projected/680e1c04-e829-45a0-a323-4d40ec62b076-kube-api-access-tb6f7\") pod \"test-operator-controller-manager-5c5cb9c4d7-xrkmh\" (UID: \"680e1c04-e829-45a0-a323-4d40ec62b076\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.986490 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-fnxtl\" (UniqueName: \"kubernetes.io/projected/281e49ea-bf93-4ad0-8081-eced425b1a7e-kube-api-access-fnxtl\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.986510 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.986567 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s8sl\" (UniqueName: \"kubernetes.io/projected/2530b5b5-5bd6-430a-8646-f77cd6f4ceae-kube-api-access-7s8sl\") pod \"watcher-operator-controller-manager-6dd88c6f67-fqvkq\" (UID: \"2530b5b5-5bd6-430a-8646-f77cd6f4ceae\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fqvkq" Mar 10 16:05:50 crc kubenswrapper[4749]: E0310 16:05:50.986732 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 16:05:50 crc kubenswrapper[4749]: E0310 16:05:50.986780 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert podName:b4cb9d6b-00f0-478e-a275-2720e6f90e8a nodeName:}" failed. No retries permitted until 2026-03-10 16:05:51.986761152 +0000 UTC m=+1049.108626839 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert") pod "infra-operator-controller-manager-5995f4446f-5r86d" (UID: "b4cb9d6b-00f0-478e-a275-2720e6f90e8a") : secret "infra-operator-webhook-server-cert" not found Mar 10 16:05:50 crc kubenswrapper[4749]: I0310 16:05:50.990129 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-v6kk4" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.029807 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb6f7\" (UniqueName: \"kubernetes.io/projected/680e1c04-e829-45a0-a323-4d40ec62b076-kube-api-access-tb6f7\") pod \"test-operator-controller-manager-5c5cb9c4d7-xrkmh\" (UID: \"680e1c04-e829-45a0-a323-4d40ec62b076\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.088536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s8sl\" (UniqueName: \"kubernetes.io/projected/2530b5b5-5bd6-430a-8646-f77cd6f4ceae-kube-api-access-7s8sl\") pod \"watcher-operator-controller-manager-6dd88c6f67-fqvkq\" (UID: \"2530b5b5-5bd6-430a-8646-f77cd6f4ceae\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fqvkq" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.088609 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.088656 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnshn\" (UniqueName: \"kubernetes.io/projected/f35968c8-813f-473a-9bfc-46a3ff38318e-kube-api-access-cnshn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l8vqg\" (UID: \"f35968c8-813f-473a-9bfc-46a3ff38318e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l8vqg" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.088786 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnxtl\" (UniqueName: \"kubernetes.io/projected/281e49ea-bf93-4ad0-8081-eced425b1a7e-kube-api-access-fnxtl\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.088813 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:51 crc kubenswrapper[4749]: E0310 16:05:51.088971 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 16:05:51 crc kubenswrapper[4749]: E0310 16:05:51.089019 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs podName:281e49ea-bf93-4ad0-8081-eced425b1a7e nodeName:}" failed. No retries permitted until 2026-03-10 16:05:51.58900379 +0000 UTC m=+1048.710869477 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-6rsrj" (UID: "281e49ea-bf93-4ad0-8081-eced425b1a7e") : secret "webhook-server-cert" not found Mar 10 16:05:51 crc kubenswrapper[4749]: E0310 16:05:51.089361 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 16:05:51 crc kubenswrapper[4749]: E0310 16:05:51.089412 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs podName:281e49ea-bf93-4ad0-8081-eced425b1a7e nodeName:}" failed. No retries permitted until 2026-03-10 16:05:51.589404062 +0000 UTC m=+1048.711269749 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-6rsrj" (UID: "281e49ea-bf93-4ad0-8081-eced425b1a7e") : secret "metrics-server-cert" not found Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.107310 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnxtl\" (UniqueName: \"kubernetes.io/projected/281e49ea-bf93-4ad0-8081-eced425b1a7e-kube-api-access-fnxtl\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.107972 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s8sl\" (UniqueName: \"kubernetes.io/projected/2530b5b5-5bd6-430a-8646-f77cd6f4ceae-kube-api-access-7s8sl\") pod \"watcher-operator-controller-manager-6dd88c6f67-fqvkq\" (UID: \"2530b5b5-5bd6-430a-8646-f77cd6f4ceae\") " 
pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fqvkq" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.179687 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.188601 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.190557 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnshn\" (UniqueName: \"kubernetes.io/projected/f35968c8-813f-473a-9bfc-46a3ff38318e-kube-api-access-cnshn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l8vqg\" (UID: \"f35968c8-813f-473a-9bfc-46a3ff38318e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l8vqg" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.212006 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fqvkq" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.216932 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnshn\" (UniqueName: \"kubernetes.io/projected/f35968c8-813f-473a-9bfc-46a3ff38318e-kube-api-access-cnshn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l8vqg\" (UID: \"f35968c8-813f-473a-9bfc-46a3ff38318e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l8vqg" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.284288 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-rk5qv"] Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.291175 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-cl6tb"] Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.291843 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885ft9ddh\" (UID: \"5a87391b-1b62-4214-ae0d-07c29e9e5efa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:05:51 crc kubenswrapper[4749]: E0310 16:05:51.292026 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 16:05:51 crc kubenswrapper[4749]: E0310 16:05:51.292122 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert podName:5a87391b-1b62-4214-ae0d-07c29e9e5efa nodeName:}" failed. 
No retries permitted until 2026-03-10 16:05:52.292091639 +0000 UTC m=+1049.413957326 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" (UID: "5a87391b-1b62-4214-ae0d-07c29e9e5efa") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 16:05:51 crc kubenswrapper[4749]: W0310 16:05:51.301833 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aa25d5b_083e_4b81_ab1e_018e4305b8be.slice/crio-1d3787ee0ad1c77f19e388d13f179709f447c39fd1064b31ec0d19cb1566b02e WatchSource:0}: Error finding container 1d3787ee0ad1c77f19e388d13f179709f447c39fd1064b31ec0d19cb1566b02e: Status 404 returned error can't find the container with id 1d3787ee0ad1c77f19e388d13f179709f447c39fd1064b31ec0d19cb1566b02e Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.316567 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l8vqg" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.467363 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-w9j99"] Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.557665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-w9j99" event={"ID":"c7551811-07e9-4b2d-8367-8468bf446068","Type":"ContainerStarted","Data":"569362c7344bb3fd4238728af9298b831ac8574448fd33c78902d3d57a3f6b0c"} Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.562472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cl6tb" event={"ID":"67fcadbc-6b7f-47b2-a723-544783895834","Type":"ContainerStarted","Data":"21cb6013329850b4b1ce14586ea7e4dbdc7fb9b751f79e163c3a202d83b1b1f1"} Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.563484 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rk5qv" event={"ID":"9aa25d5b-083e-4b81-ab1e-018e4305b8be","Type":"ContainerStarted","Data":"1d3787ee0ad1c77f19e388d13f179709f447c39fd1064b31ec0d19cb1566b02e"} Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.600730 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.600817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:51 crc kubenswrapper[4749]: E0310 16:05:51.600952 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 16:05:51 crc kubenswrapper[4749]: E0310 16:05:51.600986 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 16:05:51 crc kubenswrapper[4749]: E0310 16:05:51.601049 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs podName:281e49ea-bf93-4ad0-8081-eced425b1a7e nodeName:}" failed. No retries permitted until 2026-03-10 16:05:52.601024065 +0000 UTC m=+1049.722889752 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-6rsrj" (UID: "281e49ea-bf93-4ad0-8081-eced425b1a7e") : secret "webhook-server-cert" not found Mar 10 16:05:51 crc kubenswrapper[4749]: E0310 16:05:51.601082 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs podName:281e49ea-bf93-4ad0-8081-eced425b1a7e nodeName:}" failed. No retries permitted until 2026-03-10 16:05:52.601064016 +0000 UTC m=+1049.722929703 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-6rsrj" (UID: "281e49ea-bf93-4ad0-8081-eced425b1a7e") : secret "metrics-server-cert" not found Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.837918 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fcz88"] Mar 10 16:05:51 crc kubenswrapper[4749]: W0310 16:05:51.843093 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ad2d548_d5ed_4933_9a6d_1cb903434d41.slice/crio-09a6a5aa51dafc4b51f45b7357918dfcdd2ac9fb14736fb2be6065d22a6d3c5e WatchSource:0}: Error finding container 09a6a5aa51dafc4b51f45b7357918dfcdd2ac9fb14736fb2be6065d22a6d3c5e: Status 404 returned error can't find the container with id 09a6a5aa51dafc4b51f45b7357918dfcdd2ac9fb14736fb2be6065d22a6d3c5e Mar 10 16:05:51 crc kubenswrapper[4749]: I0310 16:05:51.997824 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdtpp"] Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.003943 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-kgb5j"] Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.011960 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert\") pod \"infra-operator-controller-manager-5995f4446f-5r86d\" (UID: \"b4cb9d6b-00f0-478e-a275-2720e6f90e8a\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.012232 4749 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.012287 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert podName:b4cb9d6b-00f0-478e-a275-2720e6f90e8a nodeName:}" failed. No retries permitted until 2026-03-10 16:05:54.012269632 +0000 UTC m=+1051.134135319 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert") pod "infra-operator-controller-manager-5995f4446f-5r86d" (UID: "b4cb9d6b-00f0-478e-a275-2720e6f90e8a") : secret "infra-operator-webhook-server-cert" not found Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.027982 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-5spgx"] Mar 10 16:05:52 crc kubenswrapper[4749]: W0310 16:05:52.030524 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86e55c7e_3719_4d43_9803_ec8185965320.slice/crio-4cd749cdd2ed79e3eae766c6dcf5283f82e131179bf932674cb7c2ae9a566e5f WatchSource:0}: Error finding container 4cd749cdd2ed79e3eae766c6dcf5283f82e131179bf932674cb7c2ae9a566e5f: Status 404 returned error can't find the container with id 4cd749cdd2ed79e3eae766c6dcf5283f82e131179bf932674cb7c2ae9a566e5f Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.040771 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-cmqsz"] Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.054431 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-94s5n"] Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.066905 4749 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-nt44l"] Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.079744 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-986pw"] Mar 10 16:05:52 crc kubenswrapper[4749]: W0310 16:05:52.087337 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944a1147_1517_4491_b7ee_1d0479e25c4c.slice/crio-dc9e583034488188b4029a2ab5b7ed2fc1c3a49fc80224079e8d2c2b69664ff4 WatchSource:0}: Error finding container dc9e583034488188b4029a2ab5b7ed2fc1c3a49fc80224079e8d2c2b69664ff4: Status 404 returned error can't find the container with id dc9e583034488188b4029a2ab5b7ed2fc1c3a49fc80224079e8d2c2b69664ff4 Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.088221 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-v6kk4"] Mar 10 16:05:52 crc kubenswrapper[4749]: W0310 16:05:52.090785 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod369202bf_81ec_4cf9_8540_c6a05a2447aa.slice/crio-9ae16f10261052bff29be721d175424219f4b6d394ffa942ad30f74080dedad4 WatchSource:0}: Error finding container 9ae16f10261052bff29be721d175424219f4b6d394ffa942ad30f74080dedad4: Status 404 returned error can't find the container with id 9ae16f10261052bff29be721d175424219f4b6d394ffa942ad30f74080dedad4 Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.093782 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bjv7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-mwm5j_openstack-operators(369202bf-81ec-4cf9-8540-c6a05a2447aa): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.094982 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j" podUID="369202bf-81ec-4cf9-8540-c6a05a2447aa" Mar 10 16:05:52 crc kubenswrapper[4749]: W0310 16:05:52.095069 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3747cd39_1cb6_439f_8548_41e8f2a609f4.slice/crio-aed5483bb56f23ee2effb3bb073c7f0a56c956fd723518e1ae3c99a3b63761ab WatchSource:0}: Error finding container aed5483bb56f23ee2effb3bb073c7f0a56c956fd723518e1ae3c99a3b63761ab: Status 404 returned error can't find the container with id aed5483bb56f23ee2effb3bb073c7f0a56c956fd723518e1ae3c99a3b63761ab Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.115990 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j"] Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.133080 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-8wbhh"] Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.137190 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gcgcm"] Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.280018 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fqvkq"] Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.292701 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k"] Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.300702 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l8vqg"] Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.316096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885ft9ddh\" (UID: \"5a87391b-1b62-4214-ae0d-07c29e9e5efa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.316352 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.316434 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert podName:5a87391b-1b62-4214-ae0d-07c29e9e5efa 
nodeName:}" failed. No retries permitted until 2026-03-10 16:05:54.316413846 +0000 UTC m=+1051.438279533 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" (UID: "5a87391b-1b62-4214-ae0d-07c29e9e5efa") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 16:05:52 crc kubenswrapper[4749]: W0310 16:05:52.322676 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2530b5b5_5bd6_430a_8646_f77cd6f4ceae.slice/crio-90e5b255ba76c9f69596eb060a6090ab990af0ab1c636df14999584efdc8ec50 WatchSource:0}: Error finding container 90e5b255ba76c9f69596eb060a6090ab990af0ab1c636df14999584efdc8ec50: Status 404 returned error can't find the container with id 90e5b255ba76c9f69596eb060a6090ab990af0ab1c636df14999584efdc8ec50 Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.328920 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8"] Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.350746 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh"] Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.371670 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cnshn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-l8vqg_openstack-operators(f35968c8-813f-473a-9bfc-46a3ff38318e): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.372754 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l8vqg" podUID="f35968c8-813f-473a-9bfc-46a3ff38318e" Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.393975 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-74kgb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-tzm5k_openstack-operators(57b4a19f-1a4b-4db9-8e25-fb3ed92e1388): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.397331 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k" podUID="57b4a19f-1a4b-4db9-8e25-fb3ed92e1388" Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.447521 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t55jt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-7kfw8_openstack-operators(40b9eefc-cf39-40d7-8f08-415714ea31d9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.449233 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8" podUID="40b9eefc-cf39-40d7-8f08-415714ea31d9" Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.481287 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tb6f7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-xrkmh_openstack-operators(680e1c04-e829-45a0-a323-4d40ec62b076): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.482497 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh" podUID="680e1c04-e829-45a0-a323-4d40ec62b076" Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.572411 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8" event={"ID":"40b9eefc-cf39-40d7-8f08-415714ea31d9","Type":"ContainerStarted","Data":"a6a00d79db88ecf2d0803db31a2bd36d016754342b02b66235ee28793c807dcd"} Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.574412 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-5spgx" event={"ID":"ce6cef40-3b60-442d-86b0-ad5b583183a4","Type":"ContainerStarted","Data":"bf674604d67ce65af7bbcd9c1f02cf3a63a820f65f60d3cba1975caca2723074"} 
Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.575005 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8" podUID="40b9eefc-cf39-40d7-8f08-415714ea31d9" Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.579419 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fqvkq" event={"ID":"2530b5b5-5bd6-430a-8646-f77cd6f4ceae","Type":"ContainerStarted","Data":"90e5b255ba76c9f69596eb060a6090ab990af0ab1c636df14999584efdc8ec50"} Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.581468 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdtpp" event={"ID":"6984ff81-5091-4cb9-b665-9dcd5544e193","Type":"ContainerStarted","Data":"4f72fd2690635462f1381b3e95aeea83ea1999d7e8a4e660af427b94cd3a9688"} Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.584914 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh" event={"ID":"680e1c04-e829-45a0-a323-4d40ec62b076","Type":"ContainerStarted","Data":"38fb6edda452764bf0faad7d8c5bd2613bd002f5ef5e6ea7dea02bdefaf2aa67"} Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.586524 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-94s5n" event={"ID":"b0053e43-866e-4c68-b4fe-edc5b10110f2","Type":"ContainerStarted","Data":"467ce1bb8be6534f73e3c8524bc64f3272777c75db93df4612900b480199d029"} Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.587271 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh" podUID="680e1c04-e829-45a0-a323-4d40ec62b076" Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.591360 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j" event={"ID":"369202bf-81ec-4cf9-8540-c6a05a2447aa","Type":"ContainerStarted","Data":"9ae16f10261052bff29be721d175424219f4b6d394ffa942ad30f74080dedad4"} Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.598444 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j" podUID="369202bf-81ec-4cf9-8540-c6a05a2447aa" Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.605940 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8wbhh" event={"ID":"944a1147-1517-4491-b7ee-1d0479e25c4c","Type":"ContainerStarted","Data":"dc9e583034488188b4029a2ab5b7ed2fc1c3a49fc80224079e8d2c2b69664ff4"} Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.607887 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gcgcm" event={"ID":"3747cd39-1cb6-439f-8548-41e8f2a609f4","Type":"ContainerStarted","Data":"aed5483bb56f23ee2effb3bb073c7f0a56c956fd723518e1ae3c99a3b63761ab"} Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.611121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-986pw" event={"ID":"6b9320db-2215-4964-bbd5-7437a092fe31","Type":"ContainerStarted","Data":"739e64e8c57da4fe2bea41421cb39d1631547779f1788481870b4ee2c6632225"} Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.614939 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cmqsz" event={"ID":"c0b12ff9-ef73-4f00-b0ed-655a5113714e","Type":"ContainerStarted","Data":"6aa6c57edab361c6520911ad7a82e8b8f6e5ac004a5d3751b7a1c6f21ae0dac8"} Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.617481 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k" event={"ID":"57b4a19f-1a4b-4db9-8e25-fb3ed92e1388","Type":"ContainerStarted","Data":"d7d089df678724c34b47c5e4337cbf1fddda83cc79ad581630a95b122f0ba0b7"} Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.619683 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k" podUID="57b4a19f-1a4b-4db9-8e25-fb3ed92e1388" Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.620586 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fcz88" event={"ID":"4ad2d548-d5ed-4933-9a6d-1cb903434d41","Type":"ContainerStarted","Data":"09a6a5aa51dafc4b51f45b7357918dfcdd2ac9fb14736fb2be6065d22a6d3c5e"} Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.621705 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.621823 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.621956 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.622009 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs podName:281e49ea-bf93-4ad0-8081-eced425b1a7e nodeName:}" failed. No retries permitted until 2026-03-10 16:05:54.621994109 +0000 UTC m=+1051.743859786 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-6rsrj" (UID: "281e49ea-bf93-4ad0-8081-eced425b1a7e") : secret "metrics-server-cert" not found Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.623209 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-nt44l" event={"ID":"33bd7186-cfb3-49b4-aaf1-a8015fe78fbd","Type":"ContainerStarted","Data":"7055cfd5c0f05e97d084ac887e45c6c5b497031d47e71b301e93b97871ed1cd8"} Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.623819 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.623854 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs podName:281e49ea-bf93-4ad0-8081-eced425b1a7e nodeName:}" failed. No retries permitted until 2026-03-10 16:05:54.62384498 +0000 UTC m=+1051.745710667 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-6rsrj" (UID: "281e49ea-bf93-4ad0-8081-eced425b1a7e") : secret "webhook-server-cert" not found Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.631218 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l8vqg" event={"ID":"f35968c8-813f-473a-9bfc-46a3ff38318e","Type":"ContainerStarted","Data":"928cd38ed97882736f4f089f80c5a0c99e7e8e08b5b6150ff89b508227116657"} Mar 10 16:05:52 crc kubenswrapper[4749]: E0310 16:05:52.633881 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l8vqg" podUID="f35968c8-813f-473a-9bfc-46a3ff38318e" Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.634485 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-kgb5j" event={"ID":"86e55c7e-3719-4d43-9803-ec8185965320","Type":"ContainerStarted","Data":"4cd749cdd2ed79e3eae766c6dcf5283f82e131179bf932674cb7c2ae9a566e5f"} Mar 10 16:05:52 crc kubenswrapper[4749]: I0310 16:05:52.638906 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-v6kk4" event={"ID":"fe1af8b4-2a44-478b-9936-4e3fe4d90612","Type":"ContainerStarted","Data":"9081619adaf87e8c45a21729e19c4ebfe3d45bfd97c6ab3f7abb66160de53183"} Mar 10 16:05:53 crc kubenswrapper[4749]: E0310 16:05:53.664487 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k" podUID="57b4a19f-1a4b-4db9-8e25-fb3ed92e1388" Mar 10 16:05:53 crc kubenswrapper[4749]: E0310 16:05:53.664822 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8" podUID="40b9eefc-cf39-40d7-8f08-415714ea31d9" Mar 10 16:05:53 crc kubenswrapper[4749]: E0310 16:05:53.664826 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh" podUID="680e1c04-e829-45a0-a323-4d40ec62b076" Mar 10 16:05:53 crc kubenswrapper[4749]: E0310 16:05:53.665057 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l8vqg" podUID="f35968c8-813f-473a-9bfc-46a3ff38318e" Mar 10 16:05:53 crc kubenswrapper[4749]: E0310 16:05:53.666287 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j" podUID="369202bf-81ec-4cf9-8540-c6a05a2447aa" Mar 10 16:05:54 crc kubenswrapper[4749]: I0310 16:05:54.046351 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert\") pod \"infra-operator-controller-manager-5995f4446f-5r86d\" (UID: \"b4cb9d6b-00f0-478e-a275-2720e6f90e8a\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:05:54 crc kubenswrapper[4749]: E0310 16:05:54.046548 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 16:05:54 crc kubenswrapper[4749]: E0310 16:05:54.046666 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert podName:b4cb9d6b-00f0-478e-a275-2720e6f90e8a nodeName:}" failed. No retries permitted until 2026-03-10 16:05:58.046641909 +0000 UTC m=+1055.168507596 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert") pod "infra-operator-controller-manager-5995f4446f-5r86d" (UID: "b4cb9d6b-00f0-478e-a275-2720e6f90e8a") : secret "infra-operator-webhook-server-cert" not found Mar 10 16:05:54 crc kubenswrapper[4749]: I0310 16:05:54.350804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885ft9ddh\" (UID: \"5a87391b-1b62-4214-ae0d-07c29e9e5efa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:05:54 crc kubenswrapper[4749]: E0310 16:05:54.350995 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 16:05:54 crc kubenswrapper[4749]: E0310 16:05:54.351119 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert podName:5a87391b-1b62-4214-ae0d-07c29e9e5efa nodeName:}" failed. No retries permitted until 2026-03-10 16:05:58.351083332 +0000 UTC m=+1055.472949019 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" (UID: "5a87391b-1b62-4214-ae0d-07c29e9e5efa") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 16:05:54 crc kubenswrapper[4749]: I0310 16:05:54.654711 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:54 crc kubenswrapper[4749]: I0310 16:05:54.654858 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:54 crc kubenswrapper[4749]: E0310 16:05:54.654959 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 16:05:54 crc kubenswrapper[4749]: E0310 16:05:54.655226 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 16:05:54 crc kubenswrapper[4749]: E0310 16:05:54.655312 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs podName:281e49ea-bf93-4ad0-8081-eced425b1a7e nodeName:}" failed. No retries permitted until 2026-03-10 16:05:58.655293737 +0000 UTC m=+1055.777159424 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-6rsrj" (UID: "281e49ea-bf93-4ad0-8081-eced425b1a7e") : secret "metrics-server-cert" not found Mar 10 16:05:54 crc kubenswrapper[4749]: E0310 16:05:54.655486 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs podName:281e49ea-bf93-4ad0-8081-eced425b1a7e nodeName:}" failed. No retries permitted until 2026-03-10 16:05:58.655466072 +0000 UTC m=+1055.777331769 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-6rsrj" (UID: "281e49ea-bf93-4ad0-8081-eced425b1a7e") : secret "webhook-server-cert" not found Mar 10 16:05:58 crc kubenswrapper[4749]: I0310 16:05:58.112886 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert\") pod \"infra-operator-controller-manager-5995f4446f-5r86d\" (UID: \"b4cb9d6b-00f0-478e-a275-2720e6f90e8a\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:05:58 crc kubenswrapper[4749]: E0310 16:05:58.113095 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 16:05:58 crc kubenswrapper[4749]: E0310 16:05:58.113405 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert podName:b4cb9d6b-00f0-478e-a275-2720e6f90e8a nodeName:}" failed. No retries permitted until 2026-03-10 16:06:06.11335907 +0000 UTC m=+1063.235224757 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert") pod "infra-operator-controller-manager-5995f4446f-5r86d" (UID: "b4cb9d6b-00f0-478e-a275-2720e6f90e8a") : secret "infra-operator-webhook-server-cert" not found Mar 10 16:05:58 crc kubenswrapper[4749]: I0310 16:05:58.418448 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885ft9ddh\" (UID: \"5a87391b-1b62-4214-ae0d-07c29e9e5efa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:05:58 crc kubenswrapper[4749]: E0310 16:05:58.418923 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 16:05:58 crc kubenswrapper[4749]: E0310 16:05:58.418969 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert podName:5a87391b-1b62-4214-ae0d-07c29e9e5efa nodeName:}" failed. No retries permitted until 2026-03-10 16:06:06.418955054 +0000 UTC m=+1063.540820741 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" (UID: "5a87391b-1b62-4214-ae0d-07c29e9e5efa") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 16:05:58 crc kubenswrapper[4749]: I0310 16:05:58.723168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:58 crc kubenswrapper[4749]: I0310 16:05:58.723296 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:05:58 crc kubenswrapper[4749]: E0310 16:05:58.723403 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 16:05:58 crc kubenswrapper[4749]: E0310 16:05:58.723479 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs podName:281e49ea-bf93-4ad0-8081-eced425b1a7e nodeName:}" failed. No retries permitted until 2026-03-10 16:06:06.723459438 +0000 UTC m=+1063.845325125 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-6rsrj" (UID: "281e49ea-bf93-4ad0-8081-eced425b1a7e") : secret "webhook-server-cert" not found Mar 10 16:05:58 crc kubenswrapper[4749]: E0310 16:05:58.723517 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 16:05:58 crc kubenswrapper[4749]: E0310 16:05:58.723584 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs podName:281e49ea-bf93-4ad0-8081-eced425b1a7e nodeName:}" failed. No retries permitted until 2026-03-10 16:06:06.72356308 +0000 UTC m=+1063.845428847 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-6rsrj" (UID: "281e49ea-bf93-4ad0-8081-eced425b1a7e") : secret "metrics-server-cert" not found Mar 10 16:06:00 crc kubenswrapper[4749]: I0310 16:06:00.138424 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552646-fcx9s"] Mar 10 16:06:00 crc kubenswrapper[4749]: I0310 16:06:00.140727 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552646-fcx9s" Mar 10 16:06:00 crc kubenswrapper[4749]: I0310 16:06:00.143121 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:06:00 crc kubenswrapper[4749]: I0310 16:06:00.143606 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:06:00 crc kubenswrapper[4749]: I0310 16:06:00.143697 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:06:00 crc kubenswrapper[4749]: I0310 16:06:00.149789 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552646-fcx9s"] Mar 10 16:06:00 crc kubenswrapper[4749]: I0310 16:06:00.248492 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbw5s\" (UniqueName: \"kubernetes.io/projected/31d12a68-0f40-4da5-8662-2228ed4812e4-kube-api-access-pbw5s\") pod \"auto-csr-approver-29552646-fcx9s\" (UID: \"31d12a68-0f40-4da5-8662-2228ed4812e4\") " pod="openshift-infra/auto-csr-approver-29552646-fcx9s" Mar 10 16:06:00 crc kubenswrapper[4749]: I0310 16:06:00.350262 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbw5s\" (UniqueName: \"kubernetes.io/projected/31d12a68-0f40-4da5-8662-2228ed4812e4-kube-api-access-pbw5s\") pod \"auto-csr-approver-29552646-fcx9s\" (UID: \"31d12a68-0f40-4da5-8662-2228ed4812e4\") " pod="openshift-infra/auto-csr-approver-29552646-fcx9s" Mar 10 16:06:00 crc kubenswrapper[4749]: I0310 16:06:00.381975 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbw5s\" (UniqueName: \"kubernetes.io/projected/31d12a68-0f40-4da5-8662-2228ed4812e4-kube-api-access-pbw5s\") pod \"auto-csr-approver-29552646-fcx9s\" (UID: \"31d12a68-0f40-4da5-8662-2228ed4812e4\") " 
pod="openshift-infra/auto-csr-approver-29552646-fcx9s" Mar 10 16:06:00 crc kubenswrapper[4749]: I0310 16:06:00.470969 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552646-fcx9s" Mar 10 16:06:03 crc kubenswrapper[4749]: I0310 16:06:03.741398 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cl6tb" event={"ID":"67fcadbc-6b7f-47b2-a723-544783895834","Type":"ContainerStarted","Data":"5aa8d36e4c2593e2b24ecf651444d352b27ff260a4c90a4ae5a50eebdcfca5e5"} Mar 10 16:06:03 crc kubenswrapper[4749]: I0310 16:06:03.742033 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cl6tb" Mar 10 16:06:03 crc kubenswrapper[4749]: I0310 16:06:03.747262 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-986pw" event={"ID":"6b9320db-2215-4964-bbd5-7437a092fe31","Type":"ContainerStarted","Data":"8f211015dfe8d8b3b4c17ca3a1018d39c251e526448722f2c01dbca75b5ab646"} Mar 10 16:06:03 crc kubenswrapper[4749]: I0310 16:06:03.747429 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-986pw" Mar 10 16:06:03 crc kubenswrapper[4749]: I0310 16:06:03.751118 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8wbhh" event={"ID":"944a1147-1517-4491-b7ee-1d0479e25c4c","Type":"ContainerStarted","Data":"cf99508be334e6cb414f2499f978d5678e093ff8ce47eb6dc4942d6d7e03ff94"} Mar 10 16:06:03 crc kubenswrapper[4749]: I0310 16:06:03.751435 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8wbhh" Mar 10 16:06:03 crc kubenswrapper[4749]: I0310 16:06:03.764127 4749 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cl6tb" podStartSLOduration=2.750460833 podStartE2EDuration="13.76411093s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:51.301496139 +0000 UTC m=+1048.423361826" lastFinishedPulling="2026-03-10 16:06:02.315146236 +0000 UTC m=+1059.437011923" observedRunningTime="2026-03-10 16:06:03.755345167 +0000 UTC m=+1060.877210854" watchObservedRunningTime="2026-03-10 16:06:03.76411093 +0000 UTC m=+1060.885976617" Mar 10 16:06:03 crc kubenswrapper[4749]: I0310 16:06:03.776416 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552646-fcx9s"] Mar 10 16:06:03 crc kubenswrapper[4749]: I0310 16:06:03.784797 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8wbhh" podStartSLOduration=2.571535502 podStartE2EDuration="13.784778051s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.093408176 +0000 UTC m=+1049.215273863" lastFinishedPulling="2026-03-10 16:06:03.306650715 +0000 UTC m=+1060.428516412" observedRunningTime="2026-03-10 16:06:03.778392015 +0000 UTC m=+1060.900257702" watchObservedRunningTime="2026-03-10 16:06:03.784778051 +0000 UTC m=+1060.906643748" Mar 10 16:06:03 crc kubenswrapper[4749]: W0310 16:06:03.785188 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31d12a68_0f40_4da5_8662_2228ed4812e4.slice/crio-f6294a5faf1a136eca977f26a9aaeec760c9b5e5b48cd0999fb994d3a1269cb9 WatchSource:0}: Error finding container f6294a5faf1a136eca977f26a9aaeec760c9b5e5b48cd0999fb994d3a1269cb9: Status 404 returned error can't find the container with id f6294a5faf1a136eca977f26a9aaeec760c9b5e5b48cd0999fb994d3a1269cb9 Mar 10 16:06:03 crc 
kubenswrapper[4749]: I0310 16:06:03.798651 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-986pw" podStartSLOduration=2.515102341 podStartE2EDuration="13.798630854s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.083663086 +0000 UTC m=+1049.205528773" lastFinishedPulling="2026-03-10 16:06:03.367191599 +0000 UTC m=+1060.489057286" observedRunningTime="2026-03-10 16:06:03.794570143 +0000 UTC m=+1060.916435830" watchObservedRunningTime="2026-03-10 16:06:03.798630854 +0000 UTC m=+1060.920496541" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.762611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552646-fcx9s" event={"ID":"31d12a68-0f40-4da5-8662-2228ed4812e4","Type":"ContainerStarted","Data":"f6294a5faf1a136eca977f26a9aaeec760c9b5e5b48cd0999fb994d3a1269cb9"} Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.764220 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-kgb5j" event={"ID":"86e55c7e-3719-4d43-9803-ec8185965320","Type":"ContainerStarted","Data":"0ab333253a7a03effbd4ca71679e845392d74a40d459c2d5cdf326531f07699a"} Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.764313 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-kgb5j" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.776004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fcz88" event={"ID":"4ad2d548-d5ed-4933-9a6d-1cb903434d41","Type":"ContainerStarted","Data":"0fa6584783dfba25069f288688a1b70602ae89d00caa6d0dd502b24ee9263353"} Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.776409 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fcz88" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.785362 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-nt44l" event={"ID":"33bd7186-cfb3-49b4-aaf1-a8015fe78fbd","Type":"ContainerStarted","Data":"8a708c8e7259ed385c2ded3f7ac2bcb8d6b762305dc26982bcd311a00767e8c9"} Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.785457 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-nt44l" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.790835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fqvkq" event={"ID":"2530b5b5-5bd6-430a-8646-f77cd6f4ceae","Type":"ContainerStarted","Data":"cc0d1200b3fdeb3ecfe1cf0b215cd3f86c0c50bc9bfcd29d6d0c55da49016030"} Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.790941 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fqvkq" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.798972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gcgcm" event={"ID":"3747cd39-1cb6-439f-8548-41e8f2a609f4","Type":"ContainerStarted","Data":"71d0b77d0cbe0e18de617d209ebd7310e6bf1a6c040ed4c9a9cf7ec0a98402a2"} Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.799056 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gcgcm" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.802106 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-kgb5j" podStartSLOduration=3.396623027 
podStartE2EDuration="14.802081294s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.044341078 +0000 UTC m=+1049.166206775" lastFinishedPulling="2026-03-10 16:06:03.449799355 +0000 UTC m=+1060.571665042" observedRunningTime="2026-03-10 16:06:04.794190776 +0000 UTC m=+1061.916056473" watchObservedRunningTime="2026-03-10 16:06:04.802081294 +0000 UTC m=+1061.923946981" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.806658 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-w9j99" event={"ID":"c7551811-07e9-4b2d-8367-8468bf446068","Type":"ContainerStarted","Data":"cf6a92ddd0c090607498d978d7be45883143d93532adeb0127d22bc1525712d1"} Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.807416 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-w9j99" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.814678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cmqsz" event={"ID":"c0b12ff9-ef73-4f00-b0ed-655a5113714e","Type":"ContainerStarted","Data":"b737635e1fba5396d2602f76e50ec56a6b2c73ba969cf0a65e6c9a9dc5b5291b"} Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.814866 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cmqsz" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.817688 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-94s5n" event={"ID":"b0053e43-866e-4c68-b4fe-edc5b10110f2","Type":"ContainerStarted","Data":"dc5fada1bce5ffc62e5e4d609966c3880bfc35d3668ec683a5e92d7e0601c924"} Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.817791 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-94s5n" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.831410 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fqvkq" podStartSLOduration=3.770279294 podStartE2EDuration="14.831389275s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.348872713 +0000 UTC m=+1049.470738400" lastFinishedPulling="2026-03-10 16:06:03.409982694 +0000 UTC m=+1060.531848381" observedRunningTime="2026-03-10 16:06:04.828553496 +0000 UTC m=+1061.950419183" watchObservedRunningTime="2026-03-10 16:06:04.831389275 +0000 UTC m=+1061.953254962" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.832319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-v6kk4" event={"ID":"fe1af8b4-2a44-478b-9936-4e3fe4d90612","Type":"ContainerStarted","Data":"3d4d554ba3b4c07be2322c4957f6bc29dd6f993911dab3685f6a048f6b79c754"} Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.832922 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-v6kk4" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.834493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-5spgx" event={"ID":"ce6cef40-3b60-442d-86b0-ad5b583183a4","Type":"ContainerStarted","Data":"665c8537f2ad02df10f2887f9489829a5f4f87b296c7c1a79e1ec10c3109f5e6"} Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.834930 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-5spgx" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.844430 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rk5qv" event={"ID":"9aa25d5b-083e-4b81-ab1e-018e4305b8be","Type":"ContainerStarted","Data":"be1e75f6c426ab252959c627ef0326f5e3d26efa35a662038da439a2f65631d0"} Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.844511 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rk5qv" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.846472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdtpp" event={"ID":"6984ff81-5091-4cb9-b665-9dcd5544e193","Type":"ContainerStarted","Data":"1c77d0cd3c93efd784d49dfdd1bf600029b344263eba36257eabd8fc31bb5ce4"} Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.846508 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdtpp" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.852417 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-nt44l" podStartSLOduration=3.566745804 podStartE2EDuration="14.852398405s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.081803475 +0000 UTC m=+1049.203669162" lastFinishedPulling="2026-03-10 16:06:03.367456036 +0000 UTC m=+1060.489321763" observedRunningTime="2026-03-10 16:06:04.848658972 +0000 UTC m=+1061.970524659" watchObservedRunningTime="2026-03-10 16:06:04.852398405 +0000 UTC m=+1061.974264092" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.871333 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fcz88" podStartSLOduration=3.411522079 podStartE2EDuration="14.871294378s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" 
firstStartedPulling="2026-03-10 16:05:51.846943248 +0000 UTC m=+1048.968808945" lastFinishedPulling="2026-03-10 16:06:03.306715537 +0000 UTC m=+1060.428581244" observedRunningTime="2026-03-10 16:06:04.868490751 +0000 UTC m=+1061.990356438" watchObservedRunningTime="2026-03-10 16:06:04.871294378 +0000 UTC m=+1061.993160065" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.907968 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-v6kk4" podStartSLOduration=3.632480762 podStartE2EDuration="14.907948412s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.090409273 +0000 UTC m=+1049.212274960" lastFinishedPulling="2026-03-10 16:06:03.365876883 +0000 UTC m=+1060.487742610" observedRunningTime="2026-03-10 16:06:04.904129047 +0000 UTC m=+1062.025994734" watchObservedRunningTime="2026-03-10 16:06:04.907948412 +0000 UTC m=+1062.029814099" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.940300 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-5spgx" podStartSLOduration=3.59154424 podStartE2EDuration="14.940275277s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.013513486 +0000 UTC m=+1049.135379173" lastFinishedPulling="2026-03-10 16:06:03.362244523 +0000 UTC m=+1060.484110210" observedRunningTime="2026-03-10 16:06:04.934493227 +0000 UTC m=+1062.056358944" watchObservedRunningTime="2026-03-10 16:06:04.940275277 +0000 UTC m=+1062.062140964" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.955872 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rk5qv" podStartSLOduration=3.484738244 podStartE2EDuration="14.955847037s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" 
firstStartedPulling="2026-03-10 16:05:51.306499257 +0000 UTC m=+1048.428364944" lastFinishedPulling="2026-03-10 16:06:02.77760806 +0000 UTC m=+1059.899473737" observedRunningTime="2026-03-10 16:06:04.950241422 +0000 UTC m=+1062.072107109" watchObservedRunningTime="2026-03-10 16:06:04.955847037 +0000 UTC m=+1062.077712724" Mar 10 16:06:04 crc kubenswrapper[4749]: I0310 16:06:04.992974 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cmqsz" podStartSLOduration=3.713015772 podStartE2EDuration="14.992957105s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.081337193 +0000 UTC m=+1049.203202880" lastFinishedPulling="2026-03-10 16:06:03.361278526 +0000 UTC m=+1060.483144213" observedRunningTime="2026-03-10 16:06:04.9895826 +0000 UTC m=+1062.111448287" watchObservedRunningTime="2026-03-10 16:06:04.992957105 +0000 UTC m=+1062.114822792" Mar 10 16:06:05 crc kubenswrapper[4749]: I0310 16:06:05.046111 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdtpp" podStartSLOduration=3.753025108 podStartE2EDuration="15.046094055s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.087319108 +0000 UTC m=+1049.209184795" lastFinishedPulling="2026-03-10 16:06:03.380388055 +0000 UTC m=+1060.502253742" observedRunningTime="2026-03-10 16:06:05.026209355 +0000 UTC m=+1062.148075032" watchObservedRunningTime="2026-03-10 16:06:05.046094055 +0000 UTC m=+1062.167959742" Mar 10 16:06:05 crc kubenswrapper[4749]: I0310 16:06:05.066011 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-w9j99" podStartSLOduration=3.263268699 podStartE2EDuration="15.065996345s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" 
firstStartedPulling="2026-03-10 16:05:51.503974661 +0000 UTC m=+1048.625840348" lastFinishedPulling="2026-03-10 16:06:03.306702307 +0000 UTC m=+1060.428567994" observedRunningTime="2026-03-10 16:06:05.061685575 +0000 UTC m=+1062.183551262" watchObservedRunningTime="2026-03-10 16:06:05.065996345 +0000 UTC m=+1062.187862032" Mar 10 16:06:05 crc kubenswrapper[4749]: I0310 16:06:05.067793 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gcgcm" podStartSLOduration=3.8042563339999997 podStartE2EDuration="15.067787024s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.102255201 +0000 UTC m=+1049.224120888" lastFinishedPulling="2026-03-10 16:06:03.365785901 +0000 UTC m=+1060.487651578" observedRunningTime="2026-03-10 16:06:05.049523899 +0000 UTC m=+1062.171389586" watchObservedRunningTime="2026-03-10 16:06:05.067787024 +0000 UTC m=+1062.189652701" Mar 10 16:06:05 crc kubenswrapper[4749]: I0310 16:06:05.088285 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-94s5n" podStartSLOduration=3.762156801 podStartE2EDuration="15.088268061s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.083933555 +0000 UTC m=+1049.205799242" lastFinishedPulling="2026-03-10 16:06:03.410044815 +0000 UTC m=+1060.531910502" observedRunningTime="2026-03-10 16:06:05.085232347 +0000 UTC m=+1062.207098044" watchObservedRunningTime="2026-03-10 16:06:05.088268061 +0000 UTC m=+1062.210133748" Mar 10 16:06:05 crc kubenswrapper[4749]: I0310 16:06:05.856348 4749 generic.go:334] "Generic (PLEG): container finished" podID="31d12a68-0f40-4da5-8662-2228ed4812e4" containerID="1ad07119a18cdab9ff4030b9d94602e3bbe4d2e3ab6cd5604847ef802f7dbd2c" exitCode=0 Mar 10 16:06:05 crc kubenswrapper[4749]: I0310 16:06:05.856421 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552646-fcx9s" event={"ID":"31d12a68-0f40-4da5-8662-2228ed4812e4","Type":"ContainerDied","Data":"1ad07119a18cdab9ff4030b9d94602e3bbe4d2e3ab6cd5604847ef802f7dbd2c"} Mar 10 16:06:06 crc kubenswrapper[4749]: I0310 16:06:06.170065 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert\") pod \"infra-operator-controller-manager-5995f4446f-5r86d\" (UID: \"b4cb9d6b-00f0-478e-a275-2720e6f90e8a\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:06:06 crc kubenswrapper[4749]: I0310 16:06:06.176878 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4cb9d6b-00f0-478e-a275-2720e6f90e8a-cert\") pod \"infra-operator-controller-manager-5995f4446f-5r86d\" (UID: \"b4cb9d6b-00f0-478e-a275-2720e6f90e8a\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:06:06 crc kubenswrapper[4749]: I0310 16:06:06.450018 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:06:06 crc kubenswrapper[4749]: I0310 16:06:06.474865 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885ft9ddh\" (UID: \"5a87391b-1b62-4214-ae0d-07c29e9e5efa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:06:06 crc kubenswrapper[4749]: E0310 16:06:06.475102 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 16:06:06 crc kubenswrapper[4749]: E0310 16:06:06.475189 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert podName:5a87391b-1b62-4214-ae0d-07c29e9e5efa nodeName:}" failed. No retries permitted until 2026-03-10 16:06:22.475168178 +0000 UTC m=+1079.597033865 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" (UID: "5a87391b-1b62-4214-ae0d-07c29e9e5efa") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 16:06:06 crc kubenswrapper[4749]: I0310 16:06:06.780193 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:06:06 crc kubenswrapper[4749]: I0310 16:06:06.780944 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:06:06 crc kubenswrapper[4749]: E0310 16:06:06.780425 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 16:06:06 crc kubenswrapper[4749]: E0310 16:06:06.781046 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs podName:281e49ea-bf93-4ad0-8081-eced425b1a7e nodeName:}" failed. No retries permitted until 2026-03-10 16:06:22.781013428 +0000 UTC m=+1079.902879115 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-6rsrj" (UID: "281e49ea-bf93-4ad0-8081-eced425b1a7e") : secret "webhook-server-cert" not found Mar 10 16:06:06 crc kubenswrapper[4749]: E0310 16:06:06.781140 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 16:06:06 crc kubenswrapper[4749]: E0310 16:06:06.781212 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs podName:281e49ea-bf93-4ad0-8081-eced425b1a7e nodeName:}" failed. No retries permitted until 2026-03-10 16:06:22.781193863 +0000 UTC m=+1079.903059610 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-6rsrj" (UID: "281e49ea-bf93-4ad0-8081-eced425b1a7e") : secret "metrics-server-cert" not found Mar 10 16:06:06 crc kubenswrapper[4749]: I0310 16:06:06.865624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8" event={"ID":"40b9eefc-cf39-40d7-8f08-415714ea31d9","Type":"ContainerStarted","Data":"b3417bb936559dcebb46bca56bb0e9329b1246122f27287d4fc3038ffe6080fe"} Mar 10 16:06:06 crc kubenswrapper[4749]: I0310 16:06:06.893176 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8" podStartSLOduration=2.654117968 podStartE2EDuration="16.89313471s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.447408969 +0000 UTC m=+1049.569274656" lastFinishedPulling="2026-03-10 16:06:06.686425711 +0000 UTC m=+1063.808291398" 
observedRunningTime="2026-03-10 16:06:06.879364389 +0000 UTC m=+1064.001230076" watchObservedRunningTime="2026-03-10 16:06:06.89313471 +0000 UTC m=+1064.015000397" Mar 10 16:06:07 crc kubenswrapper[4749]: I0310 16:06:07.120337 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d"] Mar 10 16:06:07 crc kubenswrapper[4749]: W0310 16:06:07.929401 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4cb9d6b_00f0_478e_a275_2720e6f90e8a.slice/crio-846eb4759e5af956746c8afeadbe226b7a76fd2c68005569d1022559ad967a78 WatchSource:0}: Error finding container 846eb4759e5af956746c8afeadbe226b7a76fd2c68005569d1022559ad967a78: Status 404 returned error can't find the container with id 846eb4759e5af956746c8afeadbe226b7a76fd2c68005569d1022559ad967a78 Mar 10 16:06:07 crc kubenswrapper[4749]: I0310 16:06:07.989451 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552646-fcx9s" Mar 10 16:06:08 crc kubenswrapper[4749]: I0310 16:06:08.103880 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbw5s\" (UniqueName: \"kubernetes.io/projected/31d12a68-0f40-4da5-8662-2228ed4812e4-kube-api-access-pbw5s\") pod \"31d12a68-0f40-4da5-8662-2228ed4812e4\" (UID: \"31d12a68-0f40-4da5-8662-2228ed4812e4\") " Mar 10 16:06:08 crc kubenswrapper[4749]: I0310 16:06:08.111793 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d12a68-0f40-4da5-8662-2228ed4812e4-kube-api-access-pbw5s" (OuterVolumeSpecName: "kube-api-access-pbw5s") pod "31d12a68-0f40-4da5-8662-2228ed4812e4" (UID: "31d12a68-0f40-4da5-8662-2228ed4812e4"). InnerVolumeSpecName "kube-api-access-pbw5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:06:08 crc kubenswrapper[4749]: I0310 16:06:08.206161 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbw5s\" (UniqueName: \"kubernetes.io/projected/31d12a68-0f40-4da5-8662-2228ed4812e4-kube-api-access-pbw5s\") on node \"crc\" DevicePath \"\"" Mar 10 16:06:08 crc kubenswrapper[4749]: I0310 16:06:08.884666 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552646-fcx9s" event={"ID":"31d12a68-0f40-4da5-8662-2228ed4812e4","Type":"ContainerDied","Data":"f6294a5faf1a136eca977f26a9aaeec760c9b5e5b48cd0999fb994d3a1269cb9"} Mar 10 16:06:08 crc kubenswrapper[4749]: I0310 16:06:08.884736 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6294a5faf1a136eca977f26a9aaeec760c9b5e5b48cd0999fb994d3a1269cb9" Mar 10 16:06:08 crc kubenswrapper[4749]: I0310 16:06:08.884738 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552646-fcx9s" Mar 10 16:06:08 crc kubenswrapper[4749]: I0310 16:06:08.886337 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k" event={"ID":"57b4a19f-1a4b-4db9-8e25-fb3ed92e1388","Type":"ContainerStarted","Data":"6faeac8411a7817623082d78f8d462a85d3abafbadbce593e257fd3d4fe98d65"} Mar 10 16:06:08 crc kubenswrapper[4749]: I0310 16:06:08.886531 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k" Mar 10 16:06:08 crc kubenswrapper[4749]: I0310 16:06:08.888115 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" event={"ID":"b4cb9d6b-00f0-478e-a275-2720e6f90e8a","Type":"ContainerStarted","Data":"846eb4759e5af956746c8afeadbe226b7a76fd2c68005569d1022559ad967a78"} Mar 10 
16:06:08 crc kubenswrapper[4749]: I0310 16:06:08.903846 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k" podStartSLOduration=3.288746893 podStartE2EDuration="18.903826613s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.393849807 +0000 UTC m=+1049.515715494" lastFinishedPulling="2026-03-10 16:06:08.008929517 +0000 UTC m=+1065.130795214" observedRunningTime="2026-03-10 16:06:08.902538427 +0000 UTC m=+1066.024404114" watchObservedRunningTime="2026-03-10 16:06:08.903826613 +0000 UTC m=+1066.025692300" Mar 10 16:06:09 crc kubenswrapper[4749]: I0310 16:06:09.041533 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552640-jlz6h"] Mar 10 16:06:09 crc kubenswrapper[4749]: I0310 16:06:09.046457 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552640-jlz6h"] Mar 10 16:06:09 crc kubenswrapper[4749]: I0310 16:06:09.629847 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d29452a9-eba1-4780-b94c-72b02ca17315" path="/var/lib/kubelet/pods/d29452a9-eba1-4780-b94c-72b02ca17315/volumes" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.417887 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cl6tb" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.442700 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-w9j99" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.445586 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-cmqsz" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.483267 4749 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-rk5qv" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.528035 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fcz88" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.575342 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-nt44l" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.621662 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-94s5n" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.646507 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-5spgx" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.669635 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rdtpp" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.801514 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8wbhh" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.846267 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-986pw" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.878484 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-kgb5j" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.879488 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-gcgcm" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.936566 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8" Mar 10 16:06:10 crc kubenswrapper[4749]: I0310 16:06:10.994158 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-v6kk4" Mar 10 16:06:11 crc kubenswrapper[4749]: I0310 16:06:11.218269 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fqvkq" Mar 10 16:06:11 crc kubenswrapper[4749]: I0310 16:06:11.927575 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l8vqg" event={"ID":"f35968c8-813f-473a-9bfc-46a3ff38318e","Type":"ContainerStarted","Data":"e0172eaa7aeb192317d593b2b305546fb813900255908aa4e786e2d8466f7515"} Mar 10 16:06:11 crc kubenswrapper[4749]: I0310 16:06:11.930701 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" event={"ID":"b4cb9d6b-00f0-478e-a275-2720e6f90e8a","Type":"ContainerStarted","Data":"900b3fbf62ab9eb131b1b542a6f8837070f8e94017be4a00d2352c21ce6cd480"} Mar 10 16:06:11 crc kubenswrapper[4749]: I0310 16:06:11.930897 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:06:11 crc kubenswrapper[4749]: I0310 16:06:11.933193 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh" event={"ID":"680e1c04-e829-45a0-a323-4d40ec62b076","Type":"ContainerStarted","Data":"5f053eb61a6b5c0e54296e57c87413d31a841a5880089c9ba9619106b0ece615"} Mar 10 16:06:11 crc 
kubenswrapper[4749]: I0310 16:06:11.933589 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh" Mar 10 16:06:11 crc kubenswrapper[4749]: I0310 16:06:11.936649 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j" event={"ID":"369202bf-81ec-4cf9-8540-c6a05a2447aa","Type":"ContainerStarted","Data":"51a69d5d949b2e301d4bcd590bfe67cdb8e6fd86bd2f6b2b458a46698ac41b40"} Mar 10 16:06:11 crc kubenswrapper[4749]: I0310 16:06:11.937167 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j" Mar 10 16:06:11 crc kubenswrapper[4749]: I0310 16:06:11.951648 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l8vqg" podStartSLOduration=2.696045087 podStartE2EDuration="21.951611976s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.37154033 +0000 UTC m=+1049.493406017" lastFinishedPulling="2026-03-10 16:06:11.627107229 +0000 UTC m=+1068.748972906" observedRunningTime="2026-03-10 16:06:11.940242822 +0000 UTC m=+1069.062108509" watchObservedRunningTime="2026-03-10 16:06:11.951611976 +0000 UTC m=+1069.073477663" Mar 10 16:06:11 crc kubenswrapper[4749]: I0310 16:06:11.979863 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" podStartSLOduration=18.244409942 podStartE2EDuration="21.979840917s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:06:07.931851335 +0000 UTC m=+1065.053717022" lastFinishedPulling="2026-03-10 16:06:11.66728231 +0000 UTC m=+1068.789147997" observedRunningTime="2026-03-10 16:06:11.964663607 +0000 UTC m=+1069.086529294" 
watchObservedRunningTime="2026-03-10 16:06:11.979840917 +0000 UTC m=+1069.101706604" Mar 10 16:06:11 crc kubenswrapper[4749]: I0310 16:06:11.993139 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh" podStartSLOduration=2.785455859 podStartE2EDuration="21.993109923s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.481147712 +0000 UTC m=+1049.603013399" lastFinishedPulling="2026-03-10 16:06:11.688801766 +0000 UTC m=+1068.810667463" observedRunningTime="2026-03-10 16:06:11.984512235 +0000 UTC m=+1069.106377922" watchObservedRunningTime="2026-03-10 16:06:11.993109923 +0000 UTC m=+1069.114975620" Mar 10 16:06:12 crc kubenswrapper[4749]: I0310 16:06:12.000579 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j" podStartSLOduration=2.467113574 podStartE2EDuration="22.00055481s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:05:52.093672113 +0000 UTC m=+1049.215537800" lastFinishedPulling="2026-03-10 16:06:11.627113349 +0000 UTC m=+1068.748979036" observedRunningTime="2026-03-10 16:06:11.998763551 +0000 UTC m=+1069.120629238" watchObservedRunningTime="2026-03-10 16:06:12.00055481 +0000 UTC m=+1069.122420507" Mar 10 16:06:16 crc kubenswrapper[4749]: I0310 16:06:16.462162 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" Mar 10 16:06:20 crc kubenswrapper[4749]: I0310 16:06:20.866072 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-mwm5j" Mar 10 16:06:20 crc kubenswrapper[4749]: I0310 16:06:20.939545 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-7kfw8" Mar 10 16:06:20 crc kubenswrapper[4749]: I0310 16:06:20.980230 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:06:20 crc kubenswrapper[4749]: I0310 16:06:20.980293 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:06:21 crc kubenswrapper[4749]: I0310 16:06:21.183505 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-tzm5k" Mar 10 16:06:21 crc kubenswrapper[4749]: I0310 16:06:21.190927 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xrkmh" Mar 10 16:06:22 crc kubenswrapper[4749]: I0310 16:06:22.542077 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885ft9ddh\" (UID: \"5a87391b-1b62-4214-ae0d-07c29e9e5efa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:06:22 crc kubenswrapper[4749]: I0310 16:06:22.552987 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a87391b-1b62-4214-ae0d-07c29e9e5efa-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885ft9ddh\" 
(UID: \"5a87391b-1b62-4214-ae0d-07c29e9e5efa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:06:22 crc kubenswrapper[4749]: I0310 16:06:22.704777 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:06:22 crc kubenswrapper[4749]: I0310 16:06:22.846874 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:06:22 crc kubenswrapper[4749]: I0310 16:06:22.847364 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:06:22 crc kubenswrapper[4749]: I0310 16:06:22.851302 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:06:22 crc kubenswrapper[4749]: I0310 16:06:22.851521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/281e49ea-bf93-4ad0-8081-eced425b1a7e-webhook-certs\") pod 
\"openstack-operator-controller-manager-6679ddfdc7-6rsrj\" (UID: \"281e49ea-bf93-4ad0-8081-eced425b1a7e\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:06:22 crc kubenswrapper[4749]: I0310 16:06:22.953251 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh"] Mar 10 16:06:22 crc kubenswrapper[4749]: W0310 16:06:22.953842 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a87391b_1b62_4214_ae0d_07c29e9e5efa.slice/crio-f81ab419b3720c6476bf6281c5716ef486b52f1a59a1cbbf343f846368026224 WatchSource:0}: Error finding container f81ab419b3720c6476bf6281c5716ef486b52f1a59a1cbbf343f846368026224: Status 404 returned error can't find the container with id f81ab419b3720c6476bf6281c5716ef486b52f1a59a1cbbf343f846368026224 Mar 10 16:06:23 crc kubenswrapper[4749]: I0310 16:06:23.018990 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" event={"ID":"5a87391b-1b62-4214-ae0d-07c29e9e5efa","Type":"ContainerStarted","Data":"f81ab419b3720c6476bf6281c5716ef486b52f1a59a1cbbf343f846368026224"} Mar 10 16:06:23 crc kubenswrapper[4749]: I0310 16:06:23.025900 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:06:23 crc kubenswrapper[4749]: I0310 16:06:23.278122 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj"] Mar 10 16:06:23 crc kubenswrapper[4749]: W0310 16:06:23.282749 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod281e49ea_bf93_4ad0_8081_eced425b1a7e.slice/crio-1f7e5bd14b32cadd6d42373c7daa10d4e2ca47db941645f74e70dd7d25999db9 WatchSource:0}: Error finding container 1f7e5bd14b32cadd6d42373c7daa10d4e2ca47db941645f74e70dd7d25999db9: Status 404 returned error can't find the container with id 1f7e5bd14b32cadd6d42373c7daa10d4e2ca47db941645f74e70dd7d25999db9 Mar 10 16:06:24 crc kubenswrapper[4749]: I0310 16:06:24.026785 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" event={"ID":"281e49ea-bf93-4ad0-8081-eced425b1a7e","Type":"ContainerStarted","Data":"1f7e5bd14b32cadd6d42373c7daa10d4e2ca47db941645f74e70dd7d25999db9"} Mar 10 16:06:25 crc kubenswrapper[4749]: I0310 16:06:25.040776 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" event={"ID":"281e49ea-bf93-4ad0-8081-eced425b1a7e","Type":"ContainerStarted","Data":"625e2a92ce293ecaec728c7290c30eb4964c0dbe869ea76b0e7cf65410f235d2"} Mar 10 16:06:25 crc kubenswrapper[4749]: I0310 16:06:25.041134 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:06:25 crc kubenswrapper[4749]: I0310 16:06:25.090846 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" 
podStartSLOduration=35.090821524 podStartE2EDuration="35.090821524s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:06:25.077969788 +0000 UTC m=+1082.199835475" watchObservedRunningTime="2026-03-10 16:06:25.090821524 +0000 UTC m=+1082.212687211" Mar 10 16:06:29 crc kubenswrapper[4749]: I0310 16:06:29.073875 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" event={"ID":"5a87391b-1b62-4214-ae0d-07c29e9e5efa","Type":"ContainerStarted","Data":"cb342b955101670988f26c7e881d564b2e50c58a9ab93d5d3c577d3f27796157"} Mar 10 16:06:29 crc kubenswrapper[4749]: I0310 16:06:29.074475 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:06:29 crc kubenswrapper[4749]: I0310 16:06:29.164626 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" podStartSLOduration=33.608114055 podStartE2EDuration="39.164611209s" podCreationTimestamp="2026-03-10 16:05:50 +0000 UTC" firstStartedPulling="2026-03-10 16:06:22.956588573 +0000 UTC m=+1080.078454270" lastFinishedPulling="2026-03-10 16:06:28.513085737 +0000 UTC m=+1085.634951424" observedRunningTime="2026-03-10 16:06:29.160186198 +0000 UTC m=+1086.282051895" watchObservedRunningTime="2026-03-10 16:06:29.164611209 +0000 UTC m=+1086.286476896" Mar 10 16:06:33 crc kubenswrapper[4749]: I0310 16:06:33.040023 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-6rsrj" Mar 10 16:06:42 crc kubenswrapper[4749]: I0310 16:06:42.715747 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885ft9ddh" Mar 10 16:06:50 crc kubenswrapper[4749]: I0310 16:06:50.980673 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:06:50 crc kubenswrapper[4749]: I0310 16:06:50.981232 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.757295 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-wkhsj"] Mar 10 16:06:57 crc kubenswrapper[4749]: E0310 16:06:57.758067 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d12a68-0f40-4da5-8662-2228ed4812e4" containerName="oc" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.758084 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d12a68-0f40-4da5-8662-2228ed4812e4" containerName="oc" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.758263 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d12a68-0f40-4da5-8662-2228ed4812e4" containerName="oc" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.759171 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-wkhsj" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.760803 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-p42gt" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.762065 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.762067 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.767342 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-wkhsj"] Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.773003 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.803347 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7bfw\" (UniqueName: \"kubernetes.io/projected/6e28067a-b739-459b-894c-7f53e6d080c4-kube-api-access-v7bfw\") pod \"dnsmasq-dns-589db6c89c-wkhsj\" (UID: \"6e28067a-b739-459b-894c-7f53e6d080c4\") " pod="openstack/dnsmasq-dns-589db6c89c-wkhsj" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.803425 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e28067a-b739-459b-894c-7f53e6d080c4-config\") pod \"dnsmasq-dns-589db6c89c-wkhsj\" (UID: \"6e28067a-b739-459b-894c-7f53e6d080c4\") " pod="openstack/dnsmasq-dns-589db6c89c-wkhsj" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.853107 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-xdflz"] Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.854656 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.857936 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.888452 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-xdflz"] Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.905065 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/624b4ba6-f948-46e4-8300-f4f9a4c52011-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-xdflz\" (UID: \"624b4ba6-f948-46e4-8300-f4f9a4c52011\") " pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.905152 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624b4ba6-f948-46e4-8300-f4f9a4c52011-config\") pod \"dnsmasq-dns-86bbd886cf-xdflz\" (UID: \"624b4ba6-f948-46e4-8300-f4f9a4c52011\") " pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.905226 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7bfw\" (UniqueName: \"kubernetes.io/projected/6e28067a-b739-459b-894c-7f53e6d080c4-kube-api-access-v7bfw\") pod \"dnsmasq-dns-589db6c89c-wkhsj\" (UID: \"6e28067a-b739-459b-894c-7f53e6d080c4\") " pod="openstack/dnsmasq-dns-589db6c89c-wkhsj" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.905260 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e28067a-b739-459b-894c-7f53e6d080c4-config\") pod \"dnsmasq-dns-589db6c89c-wkhsj\" (UID: \"6e28067a-b739-459b-894c-7f53e6d080c4\") " pod="openstack/dnsmasq-dns-589db6c89c-wkhsj" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 
16:06:57.905321 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blzm8\" (UniqueName: \"kubernetes.io/projected/624b4ba6-f948-46e4-8300-f4f9a4c52011-kube-api-access-blzm8\") pod \"dnsmasq-dns-86bbd886cf-xdflz\" (UID: \"624b4ba6-f948-46e4-8300-f4f9a4c52011\") " pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.906595 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e28067a-b739-459b-894c-7f53e6d080c4-config\") pod \"dnsmasq-dns-589db6c89c-wkhsj\" (UID: \"6e28067a-b739-459b-894c-7f53e6d080c4\") " pod="openstack/dnsmasq-dns-589db6c89c-wkhsj" Mar 10 16:06:57 crc kubenswrapper[4749]: I0310 16:06:57.927648 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7bfw\" (UniqueName: \"kubernetes.io/projected/6e28067a-b739-459b-894c-7f53e6d080c4-kube-api-access-v7bfw\") pod \"dnsmasq-dns-589db6c89c-wkhsj\" (UID: \"6e28067a-b739-459b-894c-7f53e6d080c4\") " pod="openstack/dnsmasq-dns-589db6c89c-wkhsj" Mar 10 16:06:58 crc kubenswrapper[4749]: I0310 16:06:58.006837 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blzm8\" (UniqueName: \"kubernetes.io/projected/624b4ba6-f948-46e4-8300-f4f9a4c52011-kube-api-access-blzm8\") pod \"dnsmasq-dns-86bbd886cf-xdflz\" (UID: \"624b4ba6-f948-46e4-8300-f4f9a4c52011\") " pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" Mar 10 16:06:58 crc kubenswrapper[4749]: I0310 16:06:58.006914 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/624b4ba6-f948-46e4-8300-f4f9a4c52011-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-xdflz\" (UID: \"624b4ba6-f948-46e4-8300-f4f9a4c52011\") " pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" Mar 10 16:06:58 crc kubenswrapper[4749]: I0310 16:06:58.006958 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624b4ba6-f948-46e4-8300-f4f9a4c52011-config\") pod \"dnsmasq-dns-86bbd886cf-xdflz\" (UID: \"624b4ba6-f948-46e4-8300-f4f9a4c52011\") " pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" Mar 10 16:06:58 crc kubenswrapper[4749]: I0310 16:06:58.007925 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624b4ba6-f948-46e4-8300-f4f9a4c52011-config\") pod \"dnsmasq-dns-86bbd886cf-xdflz\" (UID: \"624b4ba6-f948-46e4-8300-f4f9a4c52011\") " pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" Mar 10 16:06:58 crc kubenswrapper[4749]: I0310 16:06:58.008324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/624b4ba6-f948-46e4-8300-f4f9a4c52011-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-xdflz\" (UID: \"624b4ba6-f948-46e4-8300-f4f9a4c52011\") " pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" Mar 10 16:06:58 crc kubenswrapper[4749]: I0310 16:06:58.025111 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blzm8\" (UniqueName: \"kubernetes.io/projected/624b4ba6-f948-46e4-8300-f4f9a4c52011-kube-api-access-blzm8\") pod \"dnsmasq-dns-86bbd886cf-xdflz\" (UID: \"624b4ba6-f948-46e4-8300-f4f9a4c52011\") " pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" Mar 10 16:06:58 crc kubenswrapper[4749]: I0310 16:06:58.078302 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-wkhsj" Mar 10 16:06:58 crc kubenswrapper[4749]: I0310 16:06:58.185272 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" Mar 10 16:06:58 crc kubenswrapper[4749]: I0310 16:06:58.412169 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-xdflz"] Mar 10 16:06:58 crc kubenswrapper[4749]: I0310 16:06:58.510281 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-wkhsj"] Mar 10 16:06:58 crc kubenswrapper[4749]: W0310 16:06:58.516650 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e28067a_b739_459b_894c_7f53e6d080c4.slice/crio-784cfeaa3330ede48aac8c11dc9cbc624ab8b9647c0ead3bbef6a96fdabba1aa WatchSource:0}: Error finding container 784cfeaa3330ede48aac8c11dc9cbc624ab8b9647c0ead3bbef6a96fdabba1aa: Status 404 returned error can't find the container with id 784cfeaa3330ede48aac8c11dc9cbc624ab8b9647c0ead3bbef6a96fdabba1aa Mar 10 16:06:59 crc kubenswrapper[4749]: I0310 16:06:59.304805 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-wkhsj" event={"ID":"6e28067a-b739-459b-894c-7f53e6d080c4","Type":"ContainerStarted","Data":"784cfeaa3330ede48aac8c11dc9cbc624ab8b9647c0ead3bbef6a96fdabba1aa"} Mar 10 16:06:59 crc kubenswrapper[4749]: I0310 16:06:59.306941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" event={"ID":"624b4ba6-f948-46e4-8300-f4f9a4c52011","Type":"ContainerStarted","Data":"78b45fbf6415181f000975cf044e4459565f5fe60352b203bfcff30d7ad2cd74"} Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.569785 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-wkhsj"] Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.616342 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-8bpfs"] Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.619522 4749 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.627220 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-8bpfs"] Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.666939 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-config\") pod \"dnsmasq-dns-79f9fc56ff-8bpfs\" (UID: \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\") " pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.667046 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5mnw\" (UniqueName: \"kubernetes.io/projected/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-kube-api-access-k5mnw\") pod \"dnsmasq-dns-79f9fc56ff-8bpfs\" (UID: \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\") " pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.667074 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-8bpfs\" (UID: \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\") " pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.768704 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5mnw\" (UniqueName: \"kubernetes.io/projected/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-kube-api-access-k5mnw\") pod \"dnsmasq-dns-79f9fc56ff-8bpfs\" (UID: \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\") " pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.768796 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-8bpfs\" (UID: \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\") " pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.768892 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-config\") pod \"dnsmasq-dns-79f9fc56ff-8bpfs\" (UID: \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\") " pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.770292 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-config\") pod \"dnsmasq-dns-79f9fc56ff-8bpfs\" (UID: \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\") " pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.770498 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-8bpfs\" (UID: \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\") " pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.801472 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5mnw\" (UniqueName: \"kubernetes.io/projected/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-kube-api-access-k5mnw\") pod \"dnsmasq-dns-79f9fc56ff-8bpfs\" (UID: \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\") " pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.900105 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-xdflz"] Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.951617 4749 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-x7462"] Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.953743 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.965090 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-x7462"] Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.965291 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.971428 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec39734d-eadc-4736-9eb5-98ad1c2a233c-config\") pod \"dnsmasq-dns-7c47bcb9f9-x7462\" (UID: \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.971502 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clgx9\" (UniqueName: \"kubernetes.io/projected/ec39734d-eadc-4736-9eb5-98ad1c2a233c-kube-api-access-clgx9\") pod \"dnsmasq-dns-7c47bcb9f9-x7462\" (UID: \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:00 crc kubenswrapper[4749]: I0310 16:07:00.971568 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec39734d-eadc-4736-9eb5-98ad1c2a233c-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-x7462\" (UID: \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.072535 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ec39734d-eadc-4736-9eb5-98ad1c2a233c-config\") pod \"dnsmasq-dns-7c47bcb9f9-x7462\" (UID: \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.072930 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clgx9\" (UniqueName: \"kubernetes.io/projected/ec39734d-eadc-4736-9eb5-98ad1c2a233c-kube-api-access-clgx9\") pod \"dnsmasq-dns-7c47bcb9f9-x7462\" (UID: \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.073019 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec39734d-eadc-4736-9eb5-98ad1c2a233c-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-x7462\" (UID: \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.074106 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec39734d-eadc-4736-9eb5-98ad1c2a233c-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-x7462\" (UID: \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.074680 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec39734d-eadc-4736-9eb5-98ad1c2a233c-config\") pod \"dnsmasq-dns-7c47bcb9f9-x7462\" (UID: \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.115469 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clgx9\" (UniqueName: \"kubernetes.io/projected/ec39734d-eadc-4736-9eb5-98ad1c2a233c-kube-api-access-clgx9\") pod 
\"dnsmasq-dns-7c47bcb9f9-x7462\" (UID: \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.313583 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.542232 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-8bpfs"] Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.768763 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.770198 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.777733 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.777806 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.777805 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mbcgz" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.777832 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.777733 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.778928 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.779840 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 16:07:01 crc 
kubenswrapper[4749]: I0310 16:07:01.793665 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.889703 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1feaa4c9-2cec-45a8-9106-5be885c26eae-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.889764 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.889798 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.889840 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpfqg\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-kube-api-access-lpfqg\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.889867 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.889890 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1feaa4c9-2cec-45a8-9106-5be885c26eae-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.889911 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.889958 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.889979 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.890019 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " 
pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.890047 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.991434 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.991484 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.991528 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpfqg\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-kube-api-access-lpfqg\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.991557 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.991586 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1feaa4c9-2cec-45a8-9106-5be885c26eae-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.991610 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.991654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.991680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.991723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.991754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.991862 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1feaa4c9-2cec-45a8-9106-5be885c26eae-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.992303 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.995692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.995781 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.997643 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.998534 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:01 crc kubenswrapper[4749]: I0310 16:07:01.998581 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.000014 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1feaa4c9-2cec-45a8-9106-5be885c26eae-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.002395 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.003240 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.011885 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1feaa4c9-2cec-45a8-9106-5be885c26eae-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.013253 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpfqg\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-kube-api-access-lpfqg\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.020645 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " pod="openstack/rabbitmq-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.096540 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.097702 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.105450 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.105819 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-77589" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.106446 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.106600 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.106742 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.106854 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.107061 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.117943 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.121784 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.195712 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.195771 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.195795 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.195950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 
16:07:02.196036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.196125 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnnrs\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-kube-api-access-wnnrs\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.196151 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.196184 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.196244 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc 
kubenswrapper[4749]: I0310 16:07:02.196302 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.196326 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.297973 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.298073 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.298153 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnnrs\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-kube-api-access-wnnrs\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.298172 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.298219 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.298249 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.298296 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.298316 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.298365 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.298415 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.298434 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.298616 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.299228 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.299436 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.300079 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.300137 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.300307 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.303064 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.303500 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: 
I0310 16:07:02.304125 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.315303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.316864 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnnrs\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-kube-api-access-wnnrs\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.330360 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:02 crc kubenswrapper[4749]: I0310 16:07:02.444347 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.213974 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.215386 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.225238 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-mgh8c" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.225706 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.225842 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.226610 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.233096 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.240352 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.313705 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grgp7\" (UniqueName: \"kubernetes.io/projected/2bf7c072-7f7d-4f94-98a5-023b069f0eab-kube-api-access-grgp7\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.313754 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2bf7c072-7f7d-4f94-98a5-023b069f0eab-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.313791 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.313828 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-kolla-config\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.313855 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.313877 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf7c072-7f7d-4f94-98a5-023b069f0eab-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.314028 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bf7c072-7f7d-4f94-98a5-023b069f0eab-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.314105 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-config-data-default\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.315572 4749 scope.go:117] "RemoveContainer" containerID="e3238c98207593cd21fd73b85c538bfa562bd50f0e54e0a2d2cc80dc34102b68" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.415939 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-kolla-config\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.416010 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.416038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf7c072-7f7d-4f94-98a5-023b069f0eab-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.416075 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bf7c072-7f7d-4f94-98a5-023b069f0eab-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.416096 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-config-data-default\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.416133 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grgp7\" (UniqueName: \"kubernetes.io/projected/2bf7c072-7f7d-4f94-98a5-023b069f0eab-kube-api-access-grgp7\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.416163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2bf7c072-7f7d-4f94-98a5-023b069f0eab-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.416204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.416505 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.417172 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/2bf7c072-7f7d-4f94-98a5-023b069f0eab-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.417544 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-kolla-config\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.418011 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-config-data-default\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.418060 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.424045 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf7c072-7f7d-4f94-98a5-023b069f0eab-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.438700 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bf7c072-7f7d-4f94-98a5-023b069f0eab-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " 
pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.440127 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grgp7\" (UniqueName: \"kubernetes.io/projected/2bf7c072-7f7d-4f94-98a5-023b069f0eab-kube-api-access-grgp7\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.442164 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " pod="openstack/openstack-galera-0" Mar 10 16:07:03 crc kubenswrapper[4749]: I0310 16:07:03.559664 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.621051 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.624028 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.628190 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.630581 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.636552 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-jqmbv" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.636709 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.636887 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.748804 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.749166 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wshzj\" (UniqueName: \"kubernetes.io/projected/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-kube-api-access-wshzj\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.749461 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") 
pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.749522 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.749539 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.749598 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.749615 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.749714 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: W0310 16:07:04.765733 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ee9d039_f9c0_43ca_b415_0fc41f5a9522.slice/crio-f0e97fa3a3212e64d05b0dc0219f2d82c3e719de6e870fce54a0115b777bb7a9 WatchSource:0}: Error finding container f0e97fa3a3212e64d05b0dc0219f2d82c3e719de6e870fce54a0115b777bb7a9: Status 404 returned error can't find the container with id f0e97fa3a3212e64d05b0dc0219f2d82c3e719de6e870fce54a0115b777bb7a9 Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.851044 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.851090 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.851111 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.851144 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.851161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.851195 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.851255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.851271 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wshzj\" (UniqueName: \"kubernetes.io/projected/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-kube-api-access-wshzj\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.851330 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.852083 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.852232 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.852239 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.852703 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.855668 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.855671 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.869878 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wshzj\" (UniqueName: \"kubernetes.io/projected/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-kube-api-access-wshzj\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.876816 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.966147 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.967146 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.967844 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.968935 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.969204 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.969249 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-r7pbr" Mar 10 16:07:04 crc kubenswrapper[4749]: I0310 16:07:04.979026 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.054550 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec710cfc-8539-47c5-8062-95911f973074-kolla-config\") pod \"memcached-0\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.054615 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec710cfc-8539-47c5-8062-95911f973074-config-data\") pod \"memcached-0\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.054656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqj7c\" (UniqueName: \"kubernetes.io/projected/ec710cfc-8539-47c5-8062-95911f973074-kube-api-access-zqj7c\") pod \"memcached-0\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.054772 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec710cfc-8539-47c5-8062-95911f973074-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.054810 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec710cfc-8539-47c5-8062-95911f973074-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.156042 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec710cfc-8539-47c5-8062-95911f973074-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.156134 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec710cfc-8539-47c5-8062-95911f973074-kolla-config\") pod \"memcached-0\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.156195 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec710cfc-8539-47c5-8062-95911f973074-config-data\") pod \"memcached-0\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.156245 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqj7c\" (UniqueName: \"kubernetes.io/projected/ec710cfc-8539-47c5-8062-95911f973074-kube-api-access-zqj7c\") pod \"memcached-0\" (UID: 
\"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.156314 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec710cfc-8539-47c5-8062-95911f973074-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.156950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec710cfc-8539-47c5-8062-95911f973074-kolla-config\") pod \"memcached-0\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.157121 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec710cfc-8539-47c5-8062-95911f973074-config-data\") pod \"memcached-0\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.160398 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec710cfc-8539-47c5-8062-95911f973074-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.162226 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec710cfc-8539-47c5-8062-95911f973074-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.173845 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqj7c\" (UniqueName: 
\"kubernetes.io/projected/ec710cfc-8539-47c5-8062-95911f973074-kube-api-access-zqj7c\") pod \"memcached-0\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.285971 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 16:07:05 crc kubenswrapper[4749]: I0310 16:07:05.397071 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" event={"ID":"0ee9d039-f9c0-43ca-b415-0fc41f5a9522","Type":"ContainerStarted","Data":"f0e97fa3a3212e64d05b0dc0219f2d82c3e719de6e870fce54a0115b777bb7a9"} Mar 10 16:07:07 crc kubenswrapper[4749]: I0310 16:07:07.234697 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 16:07:07 crc kubenswrapper[4749]: I0310 16:07:07.235879 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 16:07:07 crc kubenswrapper[4749]: I0310 16:07:07.243259 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-ffm8n" Mar 10 16:07:07 crc kubenswrapper[4749]: I0310 16:07:07.266781 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 16:07:07 crc kubenswrapper[4749]: I0310 16:07:07.287941 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lchbx\" (UniqueName: \"kubernetes.io/projected/0a876bab-aa64-429f-bcb8-7e644cc4f547-kube-api-access-lchbx\") pod \"kube-state-metrics-0\" (UID: \"0a876bab-aa64-429f-bcb8-7e644cc4f547\") " pod="openstack/kube-state-metrics-0" Mar 10 16:07:07 crc kubenswrapper[4749]: I0310 16:07:07.389512 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lchbx\" (UniqueName: 
\"kubernetes.io/projected/0a876bab-aa64-429f-bcb8-7e644cc4f547-kube-api-access-lchbx\") pod \"kube-state-metrics-0\" (UID: \"0a876bab-aa64-429f-bcb8-7e644cc4f547\") " pod="openstack/kube-state-metrics-0" Mar 10 16:07:07 crc kubenswrapper[4749]: I0310 16:07:07.411195 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lchbx\" (UniqueName: \"kubernetes.io/projected/0a876bab-aa64-429f-bcb8-7e644cc4f547-kube-api-access-lchbx\") pod \"kube-state-metrics-0\" (UID: \"0a876bab-aa64-429f-bcb8-7e644cc4f547\") " pod="openstack/kube-state-metrics-0" Mar 10 16:07:07 crc kubenswrapper[4749]: I0310 16:07:07.555867 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.109636 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.111665 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.116136 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.116186 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.116311 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.116359 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-bg5hk" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.116524 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.125523 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.225661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b3814c41-600a-4463-9695-e55c293ffead-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.225717 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.225822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b3814c41-600a-4463-9695-e55c293ffead-config\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.225973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3814c41-600a-4463-9695-e55c293ffead-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.226038 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.226111 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.226146 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.226230 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg5qm\" (UniqueName: 
\"kubernetes.io/projected/b3814c41-600a-4463-9695-e55c293ffead-kube-api-access-qg5qm\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.290704 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vms4g"] Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.292219 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.297620 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-vnzg2" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.299439 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.299997 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.301642 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-bd2hf"] Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.304607 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.312746 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vms4g"] Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.320314 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bd2hf"] Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.327903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg5qm\" (UniqueName: \"kubernetes.io/projected/b3814c41-600a-4463-9695-e55c293ffead-kube-api-access-qg5qm\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.327958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b3814c41-600a-4463-9695-e55c293ffead-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.328003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.328489 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b3814c41-600a-4463-9695-e55c293ffead-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.328495 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b3814c41-600a-4463-9695-e55c293ffead-config\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.328427 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.328815 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3814c41-600a-4463-9695-e55c293ffead-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.328915 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.328959 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.329011 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.330113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3814c41-600a-4463-9695-e55c293ffead-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.331535 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3814c41-600a-4463-9695-e55c293ffead-config\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.349143 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.354754 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.355289 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.363795 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.364101 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg5qm\" (UniqueName: \"kubernetes.io/projected/b3814c41-600a-4463-9695-e55c293ffead-kube-api-access-qg5qm\") pod \"ovsdbserver-nb-0\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.431372 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-run-ovn\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.431466 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-run\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.431497 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrltg\" (UniqueName: \"kubernetes.io/projected/e03a8285-2164-42a8-8887-95bdaf021a73-kube-api-access-nrltg\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.431533 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-lib\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.431563 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03a8285-2164-42a8-8887-95bdaf021a73-scripts\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.431591 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-etc-ovs\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.431645 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad2c472-e0a5-43d7-971e-a242a578042b-ovn-controller-tls-certs\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.431668 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-log\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.431698 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0ad2c472-e0a5-43d7-971e-a242a578042b-combined-ca-bundle\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.431725 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-log-ovn\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.431750 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6ghk\" (UniqueName: \"kubernetes.io/projected/0ad2c472-e0a5-43d7-971e-a242a578042b-kube-api-access-g6ghk\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.431776 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ad2c472-e0a5-43d7-971e-a242a578042b-scripts\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.431833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-run\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.447670 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.533203 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-lib\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.533260 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03a8285-2164-42a8-8887-95bdaf021a73-scripts\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.533288 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-etc-ovs\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.533347 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad2c472-e0a5-43d7-971e-a242a578042b-ovn-controller-tls-certs\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.533389 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-log\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.533415 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad2c472-e0a5-43d7-971e-a242a578042b-combined-ca-bundle\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.533439 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-log-ovn\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.533460 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6ghk\" (UniqueName: \"kubernetes.io/projected/0ad2c472-e0a5-43d7-971e-a242a578042b-kube-api-access-g6ghk\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.533483 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ad2c472-e0a5-43d7-971e-a242a578042b-scripts\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.533539 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-run\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.533616 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-run-ovn\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.533644 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-run\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.533671 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrltg\" (UniqueName: \"kubernetes.io/projected/e03a8285-2164-42a8-8887-95bdaf021a73-kube-api-access-nrltg\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.534055 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-lib\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.534227 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-etc-ovs\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.536341 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03a8285-2164-42a8-8887-95bdaf021a73-scripts\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " 
pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.537013 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ad2c472-e0a5-43d7-971e-a242a578042b-scripts\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.537199 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-run\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.537313 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-run-ovn\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.537406 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-run\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.538419 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-log\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.538552 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-log-ovn\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.545015 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad2c472-e0a5-43d7-971e-a242a578042b-combined-ca-bundle\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.547628 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad2c472-e0a5-43d7-971e-a242a578042b-ovn-controller-tls-certs\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.559923 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrltg\" (UniqueName: \"kubernetes.io/projected/e03a8285-2164-42a8-8887-95bdaf021a73-kube-api-access-nrltg\") pod \"ovn-controller-ovs-bd2hf\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.561296 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6ghk\" (UniqueName: \"kubernetes.io/projected/0ad2c472-e0a5-43d7-971e-a242a578042b-kube-api-access-g6ghk\") pod \"ovn-controller-vms4g\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.613326 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vms4g" Mar 10 16:07:12 crc kubenswrapper[4749]: I0310 16:07:12.638737 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.189868 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.204630 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.210919 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.211460 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.212118 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-7lf8p" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.213490 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.233987 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.352406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.352480 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/99aedb1b-bca3-41ef-9399-4678f86ac87c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.352510 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.352695 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99aedb1b-bca3-41ef-9399-4678f86ac87c-config\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.352745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwshh\" (UniqueName: \"kubernetes.io/projected/99aedb1b-bca3-41ef-9399-4678f86ac87c-kube-api-access-lwshh\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.352996 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99aedb1b-bca3-41ef-9399-4678f86ac87c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.353164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.353197 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.455481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.455541 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.455581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.455606 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99aedb1b-bca3-41ef-9399-4678f86ac87c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " 
pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.455625 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.455653 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99aedb1b-bca3-41ef-9399-4678f86ac87c-config\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.455669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwshh\" (UniqueName: \"kubernetes.io/projected/99aedb1b-bca3-41ef-9399-4678f86ac87c-kube-api-access-lwshh\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.455729 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99aedb1b-bca3-41ef-9399-4678f86ac87c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.455828 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.456524 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99aedb1b-bca3-41ef-9399-4678f86ac87c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.457112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99aedb1b-bca3-41ef-9399-4678f86ac87c-config\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.457623 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99aedb1b-bca3-41ef-9399-4678f86ac87c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.461835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.466185 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.467423 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " 
pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.486027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwshh\" (UniqueName: \"kubernetes.io/projected/99aedb1b-bca3-41ef-9399-4678f86ac87c-kube-api-access-lwshh\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.492570 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.492896 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: I0310 16:07:13.532259 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:13 crc kubenswrapper[4749]: E0310 16:07:13.946354 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 10 16:07:13 crc kubenswrapper[4749]: E0310 16:07:13.946804 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-blzm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,Readines
sProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86bbd886cf-xdflz_openstack(624b4ba6-f948-46e4-8300-f4f9a4c52011): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 16:07:13 crc kubenswrapper[4749]: E0310 16:07:13.948561 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" podUID="624b4ba6-f948-46e4-8300-f4f9a4c52011" Mar 10 16:07:13 crc kubenswrapper[4749]: E0310 16:07:13.979854 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 10 16:07:13 crc kubenswrapper[4749]: E0310 16:07:13.980070 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v7bfw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-589db6c89c-wkhsj_openstack(6e28067a-b739-459b-894c-7f53e6d080c4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 16:07:13 crc kubenswrapper[4749]: E0310 16:07:13.981278 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-589db6c89c-wkhsj" podUID="6e28067a-b739-459b-894c-7f53e6d080c4" Mar 10 16:07:14 crc kubenswrapper[4749]: I0310 16:07:14.594868 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 16:07:14 crc kubenswrapper[4749]: W0310 16:07:14.616057 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeb87bc1_b9a8_44e7_8603_ba656ef9e65c.slice/crio-f08fdf2117dc8e63c0bf505cbe956ff8f16bd161d742f808cead2769cc5da5c9 WatchSource:0}: Error finding container f08fdf2117dc8e63c0bf505cbe956ff8f16bd161d742f808cead2769cc5da5c9: Status 404 returned error can't find the container with id f08fdf2117dc8e63c0bf505cbe956ff8f16bd161d742f808cead2769cc5da5c9 Mar 10 16:07:14 crc kubenswrapper[4749]: I0310 16:07:14.805734 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 16:07:14 crc kubenswrapper[4749]: I0310 16:07:14.816579 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-x7462"] Mar 10 16:07:14 crc kubenswrapper[4749]: I0310 16:07:14.823780 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 16:07:14 crc kubenswrapper[4749]: I0310 16:07:14.834719 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 16:07:14 crc kubenswrapper[4749]: W0310 16:07:14.835398 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec39734d_eadc_4736_9eb5_98ad1c2a233c.slice/crio-6b5066527d0dc45ca8e6fe18f4edbfe8254cf6aa8c7b4a8ce9eb796b9b6c8d13 WatchSource:0}: Error finding container 6b5066527d0dc45ca8e6fe18f4edbfe8254cf6aa8c7b4a8ce9eb796b9b6c8d13: Status 404 returned error can't find the 
container with id 6b5066527d0dc45ca8e6fe18f4edbfe8254cf6aa8c7b4a8ce9eb796b9b6c8d13 Mar 10 16:07:14 crc kubenswrapper[4749]: I0310 16:07:14.836906 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3","Type":"ContainerStarted","Data":"5ba4d58986e74184358b23a5e61f3e47301a3c7f6dae3924926cf282d77f215d"} Mar 10 16:07:14 crc kubenswrapper[4749]: I0310 16:07:14.838910 4749 generic.go:334] "Generic (PLEG): container finished" podID="0ee9d039-f9c0-43ca-b415-0fc41f5a9522" containerID="bc33204f19204f175773028da96d985c91c44540006d55786df62404707860ca" exitCode=0 Mar 10 16:07:14 crc kubenswrapper[4749]: I0310 16:07:14.838949 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" event={"ID":"0ee9d039-f9c0-43ca-b415-0fc41f5a9522","Type":"ContainerDied","Data":"bc33204f19204f175773028da96d985c91c44540006d55786df62404707860ca"} Mar 10 16:07:14 crc kubenswrapper[4749]: I0310 16:07:14.840126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0a876bab-aa64-429f-bcb8-7e644cc4f547","Type":"ContainerStarted","Data":"7ff5961e1ee2648f24fae15a4bb86fec514a10c7f59404769b8fb0d09e8ee815"} Mar 10 16:07:14 crc kubenswrapper[4749]: I0310 16:07:14.841307 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"feb87bc1-b9a8-44e7-8603-ba656ef9e65c","Type":"ContainerStarted","Data":"f08fdf2117dc8e63c0bf505cbe956ff8f16bd161d742f808cead2769cc5da5c9"} Mar 10 16:07:14 crc kubenswrapper[4749]: W0310 16:07:14.842062 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1feaa4c9_2cec_45a8_9106_5be885c26eae.slice/crio-810f01f8a2436913223fcdb81673d0300b3e2ff151414c944c97f8e18c53d3b3 WatchSource:0}: Error finding container 810f01f8a2436913223fcdb81673d0300b3e2ff151414c944c97f8e18c53d3b3: 
Status 404 returned error can't find the container with id 810f01f8a2436913223fcdb81673d0300b3e2ff151414c944c97f8e18c53d3b3 Mar 10 16:07:14 crc kubenswrapper[4749]: I0310 16:07:14.993178 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bd2hf"] Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.002626 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vms4g"] Mar 10 16:07:15 crc kubenswrapper[4749]: W0310 16:07:15.019033 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ad2c472_e0a5_43d7_971e_a242a578042b.slice/crio-a3b56753cfc0f188dd80527d7164c9264250815e4652e0fc6a9067ef00b12562 WatchSource:0}: Error finding container a3b56753cfc0f188dd80527d7164c9264250815e4652e0fc6a9067ef00b12562: Status 404 returned error can't find the container with id a3b56753cfc0f188dd80527d7164c9264250815e4652e0fc6a9067ef00b12562 Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.032561 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 16:07:15 crc kubenswrapper[4749]: E0310 16:07:15.055638 4749 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 10 16:07:15 crc kubenswrapper[4749]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/0ee9d039-f9c0-43ca-b415-0fc41f5a9522/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 10 16:07:15 crc kubenswrapper[4749]: > podSandboxID="f0e97fa3a3212e64d05b0dc0219f2d82c3e719de6e870fce54a0115b777bb7a9" Mar 10 16:07:15 crc kubenswrapper[4749]: E0310 16:07:15.055816 4749 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 16:07:15 crc kubenswrapper[4749]: container 
&Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k5mnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-79f9fc56ff-8bpfs_openstack(0ee9d039-f9c0-43ca-b415-0fc41f5a9522): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/0ee9d039-f9c0-43ca-b415-0fc41f5a9522/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 10 16:07:15 crc kubenswrapper[4749]: > logger="UnhandledError" Mar 10 16:07:15 crc kubenswrapper[4749]: E0310 16:07:15.056949 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/0ee9d039-f9c0-43ca-b415-0fc41f5a9522/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" podUID="0ee9d039-f9c0-43ca-b415-0fc41f5a9522" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.114027 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.200430 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-wkhsj" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.279218 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-wptw6"] Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.281644 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.283883 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.285935 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wptw6"] Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.293135 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e28067a-b739-459b-894c-7f53e6d080c4-config\") pod \"6e28067a-b739-459b-894c-7f53e6d080c4\" (UID: \"6e28067a-b739-459b-894c-7f53e6d080c4\") " Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.293285 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7bfw\" (UniqueName: \"kubernetes.io/projected/6e28067a-b739-459b-894c-7f53e6d080c4-kube-api-access-v7bfw\") pod \"6e28067a-b739-459b-894c-7f53e6d080c4\" (UID: \"6e28067a-b739-459b-894c-7f53e6d080c4\") " Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.295845 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e28067a-b739-459b-894c-7f53e6d080c4-config" (OuterVolumeSpecName: "config") pod "6e28067a-b739-459b-894c-7f53e6d080c4" (UID: "6e28067a-b739-459b-894c-7f53e6d080c4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.309689 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e28067a-b739-459b-894c-7f53e6d080c4-kube-api-access-v7bfw" (OuterVolumeSpecName: "kube-api-access-v7bfw") pod "6e28067a-b739-459b-894c-7f53e6d080c4" (UID: "6e28067a-b739-459b-894c-7f53e6d080c4"). InnerVolumeSpecName "kube-api-access-v7bfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.317705 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.394629 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blzm8\" (UniqueName: \"kubernetes.io/projected/624b4ba6-f948-46e4-8300-f4f9a4c52011-kube-api-access-blzm8\") pod \"624b4ba6-f948-46e4-8300-f4f9a4c52011\" (UID: \"624b4ba6-f948-46e4-8300-f4f9a4c52011\") " Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.394698 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624b4ba6-f948-46e4-8300-f4f9a4c52011-config\") pod \"624b4ba6-f948-46e4-8300-f4f9a4c52011\" (UID: \"624b4ba6-f948-46e4-8300-f4f9a4c52011\") " Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.394851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/624b4ba6-f948-46e4-8300-f4f9a4c52011-dns-svc\") pod \"624b4ba6-f948-46e4-8300-f4f9a4c52011\" (UID: \"624b4ba6-f948-46e4-8300-f4f9a4c52011\") " Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.395065 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8a0229a2-b07d-4baa-8b4c-a1c356e38679-config\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.395101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8a0229a2-b07d-4baa-8b4c-a1c356e38679-ovs-rundir\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.395119 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0229a2-b07d-4baa-8b4c-a1c356e38679-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.395174 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x99t7\" (UniqueName: \"kubernetes.io/projected/8a0229a2-b07d-4baa-8b4c-a1c356e38679-kube-api-access-x99t7\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.395202 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0229a2-b07d-4baa-8b4c-a1c356e38679-combined-ca-bundle\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.395230 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8a0229a2-b07d-4baa-8b4c-a1c356e38679-ovn-rundir\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.395292 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7bfw\" (UniqueName: \"kubernetes.io/projected/6e28067a-b739-459b-894c-7f53e6d080c4-kube-api-access-v7bfw\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.395305 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e28067a-b739-459b-894c-7f53e6d080c4-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.395833 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/624b4ba6-f948-46e4-8300-f4f9a4c52011-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "624b4ba6-f948-46e4-8300-f4f9a4c52011" (UID: "624b4ba6-f948-46e4-8300-f4f9a4c52011"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.396690 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/624b4ba6-f948-46e4-8300-f4f9a4c52011-config" (OuterVolumeSpecName: "config") pod "624b4ba6-f948-46e4-8300-f4f9a4c52011" (UID: "624b4ba6-f948-46e4-8300-f4f9a4c52011"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.398423 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624b4ba6-f948-46e4-8300-f4f9a4c52011-kube-api-access-blzm8" (OuterVolumeSpecName: "kube-api-access-blzm8") pod "624b4ba6-f948-46e4-8300-f4f9a4c52011" (UID: "624b4ba6-f948-46e4-8300-f4f9a4c52011"). InnerVolumeSpecName "kube-api-access-blzm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.490467 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-8bpfs"] Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.497911 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0229a2-b07d-4baa-8b4c-a1c356e38679-config\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.497972 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8a0229a2-b07d-4baa-8b4c-a1c356e38679-ovs-rundir\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.497993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0229a2-b07d-4baa-8b4c-a1c356e38679-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.498038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-x99t7\" (UniqueName: \"kubernetes.io/projected/8a0229a2-b07d-4baa-8b4c-a1c356e38679-kube-api-access-x99t7\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.498067 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0229a2-b07d-4baa-8b4c-a1c356e38679-combined-ca-bundle\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.498095 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8a0229a2-b07d-4baa-8b4c-a1c356e38679-ovn-rundir\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.498139 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/624b4ba6-f948-46e4-8300-f4f9a4c52011-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.498151 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blzm8\" (UniqueName: \"kubernetes.io/projected/624b4ba6-f948-46e4-8300-f4f9a4c52011-kube-api-access-blzm8\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.498164 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624b4ba6-f948-46e4-8300-f4f9a4c52011-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.498430 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/8a0229a2-b07d-4baa-8b4c-a1c356e38679-ovn-rundir\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.499636 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8a0229a2-b07d-4baa-8b4c-a1c356e38679-ovs-rundir\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.501030 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0229a2-b07d-4baa-8b4c-a1c356e38679-config\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.506062 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0229a2-b07d-4baa-8b4c-a1c356e38679-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.521230 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0229a2-b07d-4baa-8b4c-a1c356e38679-combined-ca-bundle\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.521226 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86dbfc8fbf-npnn8"] Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.523189 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.528723 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.542263 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x99t7\" (UniqueName: \"kubernetes.io/projected/8a0229a2-b07d-4baa-8b4c-a1c356e38679-kube-api-access-x99t7\") pod \"ovn-controller-metrics-wptw6\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.542293 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86dbfc8fbf-npnn8"] Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.599756 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wsfm\" (UniqueName: \"kubernetes.io/projected/42409815-af6b-4ba3-b834-e38bbd6035b2-kube-api-access-2wsfm\") pod \"dnsmasq-dns-86dbfc8fbf-npnn8\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.599810 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-config\") pod \"dnsmasq-dns-86dbfc8fbf-npnn8\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.599832 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-ovsdbserver-sb\") pod \"dnsmasq-dns-86dbfc8fbf-npnn8\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " 
pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.599937 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-dns-svc\") pod \"dnsmasq-dns-86dbfc8fbf-npnn8\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.633248 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.656572 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.703689 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-dns-svc\") pod \"dnsmasq-dns-86dbfc8fbf-npnn8\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.703782 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wsfm\" (UniqueName: \"kubernetes.io/projected/42409815-af6b-4ba3-b834-e38bbd6035b2-kube-api-access-2wsfm\") pod \"dnsmasq-dns-86dbfc8fbf-npnn8\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.703864 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-config\") pod \"dnsmasq-dns-86dbfc8fbf-npnn8\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.703882 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-ovsdbserver-sb\") pod \"dnsmasq-dns-86dbfc8fbf-npnn8\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.706514 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-config\") pod \"dnsmasq-dns-86dbfc8fbf-npnn8\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.706562 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-ovsdbserver-sb\") pod \"dnsmasq-dns-86dbfc8fbf-npnn8\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.707012 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-dns-svc\") pod \"dnsmasq-dns-86dbfc8fbf-npnn8\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.733364 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wsfm\" (UniqueName: \"kubernetes.io/projected/42409815-af6b-4ba3-b834-e38bbd6035b2-kube-api-access-2wsfm\") pod \"dnsmasq-dns-86dbfc8fbf-npnn8\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.856354 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"99aedb1b-bca3-41ef-9399-4678f86ac87c","Type":"ContainerStarted","Data":"bd7b870a906040d745505efde228d65a6fa5f2974b6032afdb234dab118d0aa3"} Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.860408 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ec710cfc-8539-47c5-8062-95911f973074","Type":"ContainerStarted","Data":"3d7973dba48a4eaddfd69e15d25cfcd7275ce0526e923e2fae0f1ad5997b7d31"} Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.862245 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.862311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-xdflz" event={"ID":"624b4ba6-f948-46e4-8300-f4f9a4c52011","Type":"ContainerDied","Data":"78b45fbf6415181f000975cf044e4459565f5fe60352b203bfcff30d7ad2cd74"} Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.865132 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b3814c41-600a-4463-9695-e55c293ffead","Type":"ContainerStarted","Data":"b89cd3e50cda5065b116740ac396cc4ed54a24a5d1be39eeb02a4c343838d19d"} Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.867408 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vms4g" event={"ID":"0ad2c472-e0a5-43d7-971e-a242a578042b","Type":"ContainerStarted","Data":"a3b56753cfc0f188dd80527d7164c9264250815e4652e0fc6a9067ef00b12562"} Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.879298 4749 generic.go:334] "Generic (PLEG): container finished" podID="ec39734d-eadc-4736-9eb5-98ad1c2a233c" containerID="9a063bf958dbfe07cdf4e542dce5d228c6e9130c53106cb68a3a0e9da9d593c8" exitCode=0 Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.879481 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" 
event={"ID":"ec39734d-eadc-4736-9eb5-98ad1c2a233c","Type":"ContainerDied","Data":"9a063bf958dbfe07cdf4e542dce5d228c6e9130c53106cb68a3a0e9da9d593c8"} Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.879526 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" event={"ID":"ec39734d-eadc-4736-9eb5-98ad1c2a233c","Type":"ContainerStarted","Data":"6b5066527d0dc45ca8e6fe18f4edbfe8254cf6aa8c7b4a8ce9eb796b9b6c8d13"} Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.890832 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.896657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1feaa4c9-2cec-45a8-9106-5be885c26eae","Type":"ContainerStarted","Data":"810f01f8a2436913223fcdb81673d0300b3e2ff151414c944c97f8e18c53d3b3"} Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.909164 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bd2hf" event={"ID":"e03a8285-2164-42a8-8887-95bdaf021a73","Type":"ContainerStarted","Data":"0d9b95003cfa55cb42355742a8b55d563f4484029b151a5419fa30d0458a203a"} Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.926959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2bf7c072-7f7d-4f94-98a5-023b069f0eab","Type":"ContainerStarted","Data":"3619a258f327445043c95e77610d928b2d22f732084461c84d8732ffa496a5b9"} Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.928519 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-wkhsj" Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.928575 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-wkhsj" event={"ID":"6e28067a-b739-459b-894c-7f53e6d080c4","Type":"ContainerDied","Data":"784cfeaa3330ede48aac8c11dc9cbc624ab8b9647c0ead3bbef6a96fdabba1aa"} Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.932330 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-xdflz"] Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.942766 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-xdflz"] Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.991214 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-wkhsj"] Mar 10 16:07:15 crc kubenswrapper[4749]: I0310 16:07:15.997290 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-wkhsj"] Mar 10 16:07:16 crc kubenswrapper[4749]: I0310 16:07:16.149979 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-wptw6"] Mar 10 16:07:16 crc kubenswrapper[4749]: I0310 16:07:16.519281 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86dbfc8fbf-npnn8"] Mar 10 16:07:16 crc kubenswrapper[4749]: I0310 16:07:16.940338 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" event={"ID":"0ee9d039-f9c0-43ca-b415-0fc41f5a9522","Type":"ContainerStarted","Data":"6529eb93cff88b72717ce5e7f6fd6a3c9e1d1654185bae06ea3fa738b0901a47"} Mar 10 16:07:16 crc kubenswrapper[4749]: I0310 16:07:16.940751 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" Mar 10 16:07:16 crc kubenswrapper[4749]: I0310 16:07:16.942479 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" event={"ID":"ec39734d-eadc-4736-9eb5-98ad1c2a233c","Type":"ContainerStarted","Data":"2c4bb63f381069f7f5578676b40d32aa5572ac4ee8e5551e30910261f94f0589"} Mar 10 16:07:16 crc kubenswrapper[4749]: I0310 16:07:16.943107 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:16 crc kubenswrapper[4749]: I0310 16:07:16.944789 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wptw6" event={"ID":"8a0229a2-b07d-4baa-8b4c-a1c356e38679","Type":"ContainerStarted","Data":"3296c78784ffeff1936d437132fa977e27a916744d5029bf253445b5784c62ce"} Mar 10 16:07:16 crc kubenswrapper[4749]: I0310 16:07:16.945482 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" podUID="0ee9d039-f9c0-43ca-b415-0fc41f5a9522" containerName="dnsmasq-dns" containerID="cri-o://6529eb93cff88b72717ce5e7f6fd6a3c9e1d1654185bae06ea3fa738b0901a47" gracePeriod=10 Mar 10 16:07:16 crc kubenswrapper[4749]: I0310 16:07:16.961571 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" podStartSLOduration=7.544568285 podStartE2EDuration="16.961554172s" podCreationTimestamp="2026-03-10 16:07:00 +0000 UTC" firstStartedPulling="2026-03-10 16:07:04.777568374 +0000 UTC m=+1121.899434061" lastFinishedPulling="2026-03-10 16:07:14.194554261 +0000 UTC m=+1131.316419948" observedRunningTime="2026-03-10 16:07:16.95570415 +0000 UTC m=+1134.077569847" watchObservedRunningTime="2026-03-10 16:07:16.961554172 +0000 UTC m=+1134.083419859" Mar 10 16:07:16 crc kubenswrapper[4749]: I0310 16:07:16.976175 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" podStartSLOduration=16.976151736 podStartE2EDuration="16.976151736s" podCreationTimestamp="2026-03-10 16:07:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:07:16.973985865 +0000 UTC m=+1134.095851552" watchObservedRunningTime="2026-03-10 16:07:16.976151736 +0000 UTC m=+1134.098017443" Mar 10 16:07:17 crc kubenswrapper[4749]: W0310 16:07:17.205534 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42409815_af6b_4ba3_b834_e38bbd6035b2.slice/crio-a5ced73d8eb9a3dc37e6cc4d7c89ec5ee0b630d92980a2e4add8feceac5ce1c4 WatchSource:0}: Error finding container a5ced73d8eb9a3dc37e6cc4d7c89ec5ee0b630d92980a2e4add8feceac5ce1c4: Status 404 returned error can't find the container with id a5ced73d8eb9a3dc37e6cc4d7c89ec5ee0b630d92980a2e4add8feceac5ce1c4 Mar 10 16:07:17 crc kubenswrapper[4749]: I0310 16:07:17.617545 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="624b4ba6-f948-46e4-8300-f4f9a4c52011" path="/var/lib/kubelet/pods/624b4ba6-f948-46e4-8300-f4f9a4c52011/volumes" Mar 10 16:07:17 crc kubenswrapper[4749]: I0310 16:07:17.618133 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e28067a-b739-459b-894c-7f53e6d080c4" path="/var/lib/kubelet/pods/6e28067a-b739-459b-894c-7f53e6d080c4/volumes" Mar 10 16:07:17 crc kubenswrapper[4749]: I0310 16:07:17.953531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" event={"ID":"42409815-af6b-4ba3-b834-e38bbd6035b2","Type":"ContainerStarted","Data":"a5ced73d8eb9a3dc37e6cc4d7c89ec5ee0b630d92980a2e4add8feceac5ce1c4"} Mar 10 16:07:17 crc kubenswrapper[4749]: I0310 16:07:17.957489 4749 generic.go:334] "Generic (PLEG): container finished" podID="0ee9d039-f9c0-43ca-b415-0fc41f5a9522" containerID="6529eb93cff88b72717ce5e7f6fd6a3c9e1d1654185bae06ea3fa738b0901a47" exitCode=0 Mar 10 16:07:17 crc kubenswrapper[4749]: I0310 16:07:17.957554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" event={"ID":"0ee9d039-f9c0-43ca-b415-0fc41f5a9522","Type":"ContainerDied","Data":"6529eb93cff88b72717ce5e7f6fd6a3c9e1d1654185bae06ea3fa738b0901a47"} Mar 10 16:07:20 crc kubenswrapper[4749]: I0310 16:07:20.980999 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:07:20 crc kubenswrapper[4749]: I0310 16:07:20.981647 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:07:20 crc kubenswrapper[4749]: I0310 16:07:20.981697 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 16:07:20 crc kubenswrapper[4749]: I0310 16:07:20.982477 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8df1beddcbbe4b28bedf74a692eb90fcfbb0b66981e27d81e53ac5b8485c3d4f"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:07:20 crc kubenswrapper[4749]: I0310 16:07:20.982529 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://8df1beddcbbe4b28bedf74a692eb90fcfbb0b66981e27d81e53ac5b8485c3d4f" gracePeriod=600 Mar 10 16:07:21 crc 
kubenswrapper[4749]: I0310 16:07:21.316487 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:21 crc kubenswrapper[4749]: I0310 16:07:21.994036 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="8df1beddcbbe4b28bedf74a692eb90fcfbb0b66981e27d81e53ac5b8485c3d4f" exitCode=0 Mar 10 16:07:21 crc kubenswrapper[4749]: I0310 16:07:21.994081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"8df1beddcbbe4b28bedf74a692eb90fcfbb0b66981e27d81e53ac5b8485c3d4f"} Mar 10 16:07:21 crc kubenswrapper[4749]: I0310 16:07:21.994111 4749 scope.go:117] "RemoveContainer" containerID="d4ea85a6b744107fd1b757efd6ea6aed1ac10e45ac86a77df1413fc6180c0184" Mar 10 16:07:25 crc kubenswrapper[4749]: I0310 16:07:25.159894 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" Mar 10 16:07:25 crc kubenswrapper[4749]: I0310 16:07:25.286829 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5mnw\" (UniqueName: \"kubernetes.io/projected/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-kube-api-access-k5mnw\") pod \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\" (UID: \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\") " Mar 10 16:07:25 crc kubenswrapper[4749]: I0310 16:07:25.286930 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-config\") pod \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\" (UID: \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\") " Mar 10 16:07:25 crc kubenswrapper[4749]: I0310 16:07:25.287022 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-dns-svc\") pod \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\" (UID: \"0ee9d039-f9c0-43ca-b415-0fc41f5a9522\") " Mar 10 16:07:25 crc kubenswrapper[4749]: I0310 16:07:25.290809 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-kube-api-access-k5mnw" (OuterVolumeSpecName: "kube-api-access-k5mnw") pod "0ee9d039-f9c0-43ca-b415-0fc41f5a9522" (UID: "0ee9d039-f9c0-43ca-b415-0fc41f5a9522"). InnerVolumeSpecName "kube-api-access-k5mnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:25 crc kubenswrapper[4749]: I0310 16:07:25.326082 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ee9d039-f9c0-43ca-b415-0fc41f5a9522" (UID: "0ee9d039-f9c0-43ca-b415-0fc41f5a9522"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:25 crc kubenswrapper[4749]: I0310 16:07:25.333543 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-config" (OuterVolumeSpecName: "config") pod "0ee9d039-f9c0-43ca-b415-0fc41f5a9522" (UID: "0ee9d039-f9c0-43ca-b415-0fc41f5a9522"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:25 crc kubenswrapper[4749]: I0310 16:07:25.389888 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5mnw\" (UniqueName: \"kubernetes.io/projected/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-kube-api-access-k5mnw\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:25 crc kubenswrapper[4749]: I0310 16:07:25.389922 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:25 crc kubenswrapper[4749]: I0310 16:07:25.389934 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ee9d039-f9c0-43ca-b415-0fc41f5a9522-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:26 crc kubenswrapper[4749]: I0310 16:07:26.028085 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" event={"ID":"0ee9d039-f9c0-43ca-b415-0fc41f5a9522","Type":"ContainerDied","Data":"f0e97fa3a3212e64d05b0dc0219f2d82c3e719de6e870fce54a0115b777bb7a9"} Mar 10 16:07:26 crc kubenswrapper[4749]: I0310 16:07:26.028586 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-8bpfs" Mar 10 16:07:26 crc kubenswrapper[4749]: I0310 16:07:26.051929 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-8bpfs"] Mar 10 16:07:26 crc kubenswrapper[4749]: I0310 16:07:26.057808 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-8bpfs"] Mar 10 16:07:26 crc kubenswrapper[4749]: I0310 16:07:26.560290 4749 scope.go:117] "RemoveContainer" containerID="6529eb93cff88b72717ce5e7f6fd6a3c9e1d1654185bae06ea3fa738b0901a47" Mar 10 16:07:26 crc kubenswrapper[4749]: I0310 16:07:26.771711 4749 scope.go:117] "RemoveContainer" containerID="bc33204f19204f175773028da96d985c91c44540006d55786df62404707860ca" Mar 10 16:07:27 crc kubenswrapper[4749]: I0310 16:07:27.041892 4749 generic.go:334] "Generic (PLEG): container finished" podID="42409815-af6b-4ba3-b834-e38bbd6035b2" containerID="41e645cc65e1d7616c8e533bd5959c2087d0fe758fedeea6ed16b8c2c40c2538" exitCode=0 Mar 10 16:07:27 crc kubenswrapper[4749]: I0310 16:07:27.041982 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" event={"ID":"42409815-af6b-4ba3-b834-e38bbd6035b2","Type":"ContainerDied","Data":"41e645cc65e1d7616c8e533bd5959c2087d0fe758fedeea6ed16b8c2c40c2538"} Mar 10 16:07:27 crc kubenswrapper[4749]: I0310 16:07:27.044884 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"106da756b634d444f1a07a98c656ecf91e046a9d0f74a54a7001a123a154d3af"} Mar 10 16:07:27 crc kubenswrapper[4749]: E0310 16:07:27.293028 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb" Mar 10 16:07:27 crc kubenswrapper[4749]: E0310 16:07:27.293428 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb" Mar 10 16:07:27 crc kubenswrapper[4749]: E0310 16:07:27.293570 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lchbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(0a876bab-aa64-429f-bcb8-7e644cc4f547): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 16:07:27 crc kubenswrapper[4749]: E0310 16:07:27.296565 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="0a876bab-aa64-429f-bcb8-7e644cc4f547" Mar 10 16:07:27 crc kubenswrapper[4749]: I0310 16:07:27.618758 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee9d039-f9c0-43ca-b415-0fc41f5a9522" path="/var/lib/kubelet/pods/0ee9d039-f9c0-43ca-b415-0fc41f5a9522/volumes" Mar 10 16:07:28 crc kubenswrapper[4749]: I0310 16:07:28.051987 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"ec710cfc-8539-47c5-8062-95911f973074","Type":"ContainerStarted","Data":"62c6ec9ae9969d0a788db39c30b51835114c25e683ecb1bdacaa7737d2e96d89"} Mar 10 16:07:28 crc kubenswrapper[4749]: I0310 16:07:28.053399 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 10 16:07:28 crc kubenswrapper[4749]: I0310 16:07:28.057270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vms4g" event={"ID":"0ad2c472-e0a5-43d7-971e-a242a578042b","Type":"ContainerStarted","Data":"06f9586c9a8464b0b76b1390597145a85c34bac23df52f62d5fa76c48c45d34b"} Mar 10 16:07:28 crc kubenswrapper[4749]: E0310 16:07:28.058245 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb\\\"\"" pod="openstack/kube-state-metrics-0" podUID="0a876bab-aa64-429f-bcb8-7e644cc4f547" Mar 10 16:07:28 crc kubenswrapper[4749]: I0310 16:07:28.100109 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.882131241 podStartE2EDuration="24.100091698s" podCreationTimestamp="2026-03-10 16:07:04 +0000 UTC" firstStartedPulling="2026-03-10 16:07:14.852206155 +0000 UTC m=+1131.974071842" lastFinishedPulling="2026-03-10 16:07:25.070166612 +0000 UTC m=+1142.192032299" observedRunningTime="2026-03-10 16:07:28.078712196 +0000 UTC m=+1145.200577883" watchObservedRunningTime="2026-03-10 16:07:28.100091698 +0000 UTC m=+1145.221957385" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.069594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"99aedb1b-bca3-41ef-9399-4678f86ac87c","Type":"ContainerStarted","Data":"ab8c1ce0bf3c8cbe9d4fea8597af8d5906a17d426930351c4a8cef5fb3330560"} Mar 10 16:07:29 
crc kubenswrapper[4749]: I0310 16:07:29.070175 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"99aedb1b-bca3-41ef-9399-4678f86ac87c","Type":"ContainerStarted","Data":"04134bf7a94714ff5e0058f41e64303d4889270f61ee60ea65b0f477b594285b"} Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.073212 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2bf7c072-7f7d-4f94-98a5-023b069f0eab","Type":"ContainerStarted","Data":"1f04605d6e62a2abd9147c5b7ff8f13e35fb6d4061a18f7248ff60266f1c39c5"} Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.075595 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wptw6" event={"ID":"8a0229a2-b07d-4baa-8b4c-a1c356e38679","Type":"ContainerStarted","Data":"45eb6214436b2b1d33093e3aaa79629869ba461dde4208aab226427673052ba4"} Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.078204 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b3814c41-600a-4463-9695-e55c293ffead","Type":"ContainerStarted","Data":"36ccb0b67d01f8e0c84de941ec69aa7a4955ba535082448c8b6fccd6ff57bdab"} Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.078241 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b3814c41-600a-4463-9695-e55c293ffead","Type":"ContainerStarted","Data":"5371680630eb3603a303a6b8c490e07844917998f70f368075231557f3230b3f"} Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.080309 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1feaa4c9-2cec-45a8-9106-5be885c26eae","Type":"ContainerStarted","Data":"367802e57c3ed96a7c16df84afe14689df4644175e58ac2036c8eabd7a974802"} Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.086319 4749 generic.go:334] "Generic (PLEG): container finished" podID="e03a8285-2164-42a8-8887-95bdaf021a73" 
containerID="073b3e5ffdefbc2f6d38d2adf80cce91db5bfc5892e2861424a45f5e8e3a1834" exitCode=0 Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.086728 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bd2hf" event={"ID":"e03a8285-2164-42a8-8887-95bdaf021a73","Type":"ContainerDied","Data":"073b3e5ffdefbc2f6d38d2adf80cce91db5bfc5892e2861424a45f5e8e3a1834"} Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.091554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3","Type":"ContainerStarted","Data":"743b0f5a36ebdb07c835bb35540b750bb909b97412f42ebd1ec0fb4999abbe52"} Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.095182 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" event={"ID":"42409815-af6b-4ba3-b834-e38bbd6035b2","Type":"ContainerStarted","Data":"d68e480038b29df256775b2ff1486b733715a50c64f670035b9ff1080bbb0d0d"} Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.095321 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.099301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"feb87bc1-b9a8-44e7-8603-ba656ef9e65c","Type":"ContainerStarted","Data":"dbe74dbc6755c8b2dcb5ec38c4e8f76c03ab0c1ef203635518779cb018814c81"} Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.099341 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-vms4g" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.100816 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.595410569 podStartE2EDuration="17.100794493s" podCreationTimestamp="2026-03-10 16:07:12 +0000 UTC" 
firstStartedPulling="2026-03-10 16:07:15.165714949 +0000 UTC m=+1132.287580636" lastFinishedPulling="2026-03-10 16:07:26.671098873 +0000 UTC m=+1143.792964560" observedRunningTime="2026-03-10 16:07:29.100165475 +0000 UTC m=+1146.222031182" watchObservedRunningTime="2026-03-10 16:07:29.100794493 +0000 UTC m=+1146.222660180" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.192033 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-wptw6" podStartSLOduration=3.114062588 podStartE2EDuration="14.192011857s" podCreationTimestamp="2026-03-10 16:07:15 +0000 UTC" firstStartedPulling="2026-03-10 16:07:16.204629791 +0000 UTC m=+1133.326495478" lastFinishedPulling="2026-03-10 16:07:27.28257906 +0000 UTC m=+1144.404444747" observedRunningTime="2026-03-10 16:07:29.186529795 +0000 UTC m=+1146.308395482" watchObservedRunningTime="2026-03-10 16:07:29.192011857 +0000 UTC m=+1146.313877544" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.212310 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vms4g" podStartSLOduration=5.877479963 podStartE2EDuration="17.212295778s" podCreationTimestamp="2026-03-10 16:07:12 +0000 UTC" firstStartedPulling="2026-03-10 16:07:15.022942709 +0000 UTC m=+1132.144808396" lastFinishedPulling="2026-03-10 16:07:26.357758524 +0000 UTC m=+1143.479624211" observedRunningTime="2026-03-10 16:07:29.210482867 +0000 UTC m=+1146.332348554" watchObservedRunningTime="2026-03-10 16:07:29.212295778 +0000 UTC m=+1146.334161465" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.234943 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.219501985 podStartE2EDuration="18.234926344s" podCreationTimestamp="2026-03-10 16:07:11 +0000 UTC" firstStartedPulling="2026-03-10 16:07:15.676821879 +0000 UTC m=+1132.798687566" lastFinishedPulling="2026-03-10 16:07:26.692246238 +0000 
UTC m=+1143.814111925" observedRunningTime="2026-03-10 16:07:29.231491049 +0000 UTC m=+1146.353356736" watchObservedRunningTime="2026-03-10 16:07:29.234926344 +0000 UTC m=+1146.356792031" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.318681 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" podStartSLOduration=14.31866355 podStartE2EDuration="14.31866355s" podCreationTimestamp="2026-03-10 16:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:07:29.312038767 +0000 UTC m=+1146.433904454" watchObservedRunningTime="2026-03-10 16:07:29.31866355 +0000 UTC m=+1146.440529247" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.682492 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86dbfc8fbf-npnn8"] Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.733477 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-stfb7"] Mar 10 16:07:29 crc kubenswrapper[4749]: E0310 16:07:29.733858 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee9d039-f9c0-43ca-b415-0fc41f5a9522" containerName="init" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.733885 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee9d039-f9c0-43ca-b415-0fc41f5a9522" containerName="init" Mar 10 16:07:29 crc kubenswrapper[4749]: E0310 16:07:29.733907 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee9d039-f9c0-43ca-b415-0fc41f5a9522" containerName="dnsmasq-dns" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.733915 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee9d039-f9c0-43ca-b415-0fc41f5a9522" containerName="dnsmasq-dns" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.734104 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0ee9d039-f9c0-43ca-b415-0fc41f5a9522" containerName="dnsmasq-dns" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.735085 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.751580 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.791913 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-stfb7"] Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.855870 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-config\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.855953 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-ovsdbserver-nb\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.856025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-ovsdbserver-sb\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.856101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-dns-svc\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.856168 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74wwb\" (UniqueName: \"kubernetes.io/projected/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-kube-api-access-74wwb\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.963704 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74wwb\" (UniqueName: \"kubernetes.io/projected/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-kube-api-access-74wwb\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.963997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-config\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.964075 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-ovsdbserver-nb\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.964216 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-ovsdbserver-sb\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.964308 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-dns-svc\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.965175 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-ovsdbserver-nb\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.965360 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-config\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.965585 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-dns-svc\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.965622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-ovsdbserver-sb\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: 
\"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:29 crc kubenswrapper[4749]: I0310 16:07:29.992338 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74wwb\" (UniqueName: \"kubernetes.io/projected/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-kube-api-access-74wwb\") pod \"dnsmasq-dns-659ddb758c-stfb7\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:30 crc kubenswrapper[4749]: I0310 16:07:30.050048 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:30 crc kubenswrapper[4749]: I0310 16:07:30.114283 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bd2hf" event={"ID":"e03a8285-2164-42a8-8887-95bdaf021a73","Type":"ContainerStarted","Data":"524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac"} Mar 10 16:07:30 crc kubenswrapper[4749]: I0310 16:07:30.114349 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bd2hf" event={"ID":"e03a8285-2164-42a8-8887-95bdaf021a73","Type":"ContainerStarted","Data":"091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f"} Mar 10 16:07:30 crc kubenswrapper[4749]: I0310 16:07:30.141692 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-bd2hf" podStartSLOduration=6.79403297 podStartE2EDuration="18.14167128s" podCreationTimestamp="2026-03-10 16:07:12 +0000 UTC" firstStartedPulling="2026-03-10 16:07:15.010034161 +0000 UTC m=+1132.131899848" lastFinishedPulling="2026-03-10 16:07:26.357672461 +0000 UTC m=+1143.479538158" observedRunningTime="2026-03-10 16:07:30.135701815 +0000 UTC m=+1147.257567502" watchObservedRunningTime="2026-03-10 16:07:30.14167128 +0000 UTC m=+1147.263536967" Mar 10 16:07:30 crc kubenswrapper[4749]: I0310 16:07:30.448170 4749 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:30 crc kubenswrapper[4749]: I0310 16:07:30.579825 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-stfb7"] Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.122364 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f7d5f3d-0ff3-4241-ba84-ba9e492dca64" containerID="476dbab2ba7df7b810605e89700815702263dd723f55b154c4b0174931847dc0" exitCode=0 Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.122487 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-stfb7" event={"ID":"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64","Type":"ContainerDied","Data":"476dbab2ba7df7b810605e89700815702263dd723f55b154c4b0174931847dc0"} Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.122808 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-stfb7" event={"ID":"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64","Type":"ContainerStarted","Data":"f186d2c2d7c0f29e8d2ec182342467e180950ff2a28df3adeab2d11891fbba66"} Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.122822 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" podUID="42409815-af6b-4ba3-b834-e38bbd6035b2" containerName="dnsmasq-dns" containerID="cri-o://d68e480038b29df256775b2ff1486b733715a50c64f670035b9ff1080bbb0d0d" gracePeriod=10 Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.123232 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.123267 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.510801 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.532629 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.577726 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.701806 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-config\") pod \"42409815-af6b-4ba3-b834-e38bbd6035b2\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.702313 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-dns-svc\") pod \"42409815-af6b-4ba3-b834-e38bbd6035b2\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.702351 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wsfm\" (UniqueName: \"kubernetes.io/projected/42409815-af6b-4ba3-b834-e38bbd6035b2-kube-api-access-2wsfm\") pod \"42409815-af6b-4ba3-b834-e38bbd6035b2\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.702456 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-ovsdbserver-sb\") pod \"42409815-af6b-4ba3-b834-e38bbd6035b2\" (UID: \"42409815-af6b-4ba3-b834-e38bbd6035b2\") " Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.709306 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/42409815-af6b-4ba3-b834-e38bbd6035b2-kube-api-access-2wsfm" (OuterVolumeSpecName: "kube-api-access-2wsfm") pod "42409815-af6b-4ba3-b834-e38bbd6035b2" (UID: "42409815-af6b-4ba3-b834-e38bbd6035b2"). InnerVolumeSpecName "kube-api-access-2wsfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.745021 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-config" (OuterVolumeSpecName: "config") pod "42409815-af6b-4ba3-b834-e38bbd6035b2" (UID: "42409815-af6b-4ba3-b834-e38bbd6035b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.745433 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "42409815-af6b-4ba3-b834-e38bbd6035b2" (UID: "42409815-af6b-4ba3-b834-e38bbd6035b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.751906 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "42409815-af6b-4ba3-b834-e38bbd6035b2" (UID: "42409815-af6b-4ba3-b834-e38bbd6035b2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.804438 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.804481 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.804494 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wsfm\" (UniqueName: \"kubernetes.io/projected/42409815-af6b-4ba3-b834-e38bbd6035b2-kube-api-access-2wsfm\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:31 crc kubenswrapper[4749]: I0310 16:07:31.804506 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42409815-af6b-4ba3-b834-e38bbd6035b2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.131658 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-stfb7" event={"ID":"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64","Type":"ContainerStarted","Data":"ef6e847ed7fe905a6c28fba2789a60db5628671a7f94b45ea68c0d9d6144904d"} Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.133436 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.133947 4749 generic.go:334] "Generic (PLEG): container finished" podID="42409815-af6b-4ba3-b834-e38bbd6035b2" containerID="d68e480038b29df256775b2ff1486b733715a50c64f670035b9ff1080bbb0d0d" exitCode=0 Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.134011 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.134036 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" event={"ID":"42409815-af6b-4ba3-b834-e38bbd6035b2","Type":"ContainerDied","Data":"d68e480038b29df256775b2ff1486b733715a50c64f670035b9ff1080bbb0d0d"} Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.134062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dbfc8fbf-npnn8" event={"ID":"42409815-af6b-4ba3-b834-e38bbd6035b2","Type":"ContainerDied","Data":"a5ced73d8eb9a3dc37e6cc4d7c89ec5ee0b630d92980a2e4add8feceac5ce1c4"} Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.134080 4749 scope.go:117] "RemoveContainer" containerID="d68e480038b29df256775b2ff1486b733715a50c64f670035b9ff1080bbb0d0d" Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.134451 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.153419 4749 scope.go:117] "RemoveContainer" containerID="41e645cc65e1d7616c8e533bd5959c2087d0fe758fedeea6ed16b8c2c40c2538" Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.162450 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-659ddb758c-stfb7" podStartSLOduration=3.162434455 podStartE2EDuration="3.162434455s" podCreationTimestamp="2026-03-10 16:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:07:32.156298085 +0000 UTC m=+1149.278163772" watchObservedRunningTime="2026-03-10 16:07:32.162434455 +0000 UTC m=+1149.284300142" Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.175453 4749 scope.go:117] "RemoveContainer" containerID="d68e480038b29df256775b2ff1486b733715a50c64f670035b9ff1080bbb0d0d" Mar 10 16:07:32 crc 
kubenswrapper[4749]: E0310 16:07:32.175938 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d68e480038b29df256775b2ff1486b733715a50c64f670035b9ff1080bbb0d0d\": container with ID starting with d68e480038b29df256775b2ff1486b733715a50c64f670035b9ff1080bbb0d0d not found: ID does not exist" containerID="d68e480038b29df256775b2ff1486b733715a50c64f670035b9ff1080bbb0d0d" Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.175997 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d68e480038b29df256775b2ff1486b733715a50c64f670035b9ff1080bbb0d0d"} err="failed to get container status \"d68e480038b29df256775b2ff1486b733715a50c64f670035b9ff1080bbb0d0d\": rpc error: code = NotFound desc = could not find container \"d68e480038b29df256775b2ff1486b733715a50c64f670035b9ff1080bbb0d0d\": container with ID starting with d68e480038b29df256775b2ff1486b733715a50c64f670035b9ff1080bbb0d0d not found: ID does not exist" Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.176026 4749 scope.go:117] "RemoveContainer" containerID="41e645cc65e1d7616c8e533bd5959c2087d0fe758fedeea6ed16b8c2c40c2538" Mar 10 16:07:32 crc kubenswrapper[4749]: E0310 16:07:32.176405 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41e645cc65e1d7616c8e533bd5959c2087d0fe758fedeea6ed16b8c2c40c2538\": container with ID starting with 41e645cc65e1d7616c8e533bd5959c2087d0fe758fedeea6ed16b8c2c40c2538 not found: ID does not exist" containerID="41e645cc65e1d7616c8e533bd5959c2087d0fe758fedeea6ed16b8c2c40c2538" Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.176434 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41e645cc65e1d7616c8e533bd5959c2087d0fe758fedeea6ed16b8c2c40c2538"} err="failed to get container status 
\"41e645cc65e1d7616c8e533bd5959c2087d0fe758fedeea6ed16b8c2c40c2538\": rpc error: code = NotFound desc = could not find container \"41e645cc65e1d7616c8e533bd5959c2087d0fe758fedeea6ed16b8c2c40c2538\": container with ID starting with 41e645cc65e1d7616c8e533bd5959c2087d0fe758fedeea6ed16b8c2c40c2538 not found: ID does not exist" Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.178698 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86dbfc8fbf-npnn8"] Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.186435 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86dbfc8fbf-npnn8"] Mar 10 16:07:32 crc kubenswrapper[4749]: I0310 16:07:32.448351 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.141783 4749 generic.go:334] "Generic (PLEG): container finished" podID="2bf7c072-7f7d-4f94-98a5-023b069f0eab" containerID="1f04605d6e62a2abd9147c5b7ff8f13e35fb6d4061a18f7248ff60266f1c39c5" exitCode=0 Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.141830 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2bf7c072-7f7d-4f94-98a5-023b069f0eab","Type":"ContainerDied","Data":"1f04605d6e62a2abd9147c5b7ff8f13e35fb6d4061a18f7248ff60266f1c39c5"} Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.145259 4749 generic.go:334] "Generic (PLEG): container finished" podID="feb87bc1-b9a8-44e7-8603-ba656ef9e65c" containerID="dbe74dbc6755c8b2dcb5ec38c4e8f76c03ab0c1ef203635518779cb018814c81" exitCode=0 Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.145414 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"feb87bc1-b9a8-44e7-8603-ba656ef9e65c","Type":"ContainerDied","Data":"dbe74dbc6755c8b2dcb5ec38c4e8f76c03ab0c1ef203635518779cb018814c81"} Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 
16:07:33.223011 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.508621 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.555670 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.617028 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42409815-af6b-4ba3-b834-e38bbd6035b2" path="/var/lib/kubelet/pods/42409815-af6b-4ba3-b834-e38bbd6035b2/volumes" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.721088 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 10 16:07:33 crc kubenswrapper[4749]: E0310 16:07:33.721414 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42409815-af6b-4ba3-b834-e38bbd6035b2" containerName="dnsmasq-dns" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.721429 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="42409815-af6b-4ba3-b834-e38bbd6035b2" containerName="dnsmasq-dns" Mar 10 16:07:33 crc kubenswrapper[4749]: E0310 16:07:33.721468 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42409815-af6b-4ba3-b834-e38bbd6035b2" containerName="init" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.721478 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="42409815-af6b-4ba3-b834-e38bbd6035b2" containerName="init" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.721638 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="42409815-af6b-4ba3-b834-e38bbd6035b2" containerName="dnsmasq-dns" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.722421 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.727158 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.727162 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.727224 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.729313 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-b8f2v" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.742739 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.742778 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-scripts\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.742801 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvbs2\" (UniqueName: \"kubernetes.io/projected/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-kube-api-access-tvbs2\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.742983 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.743096 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.743141 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-config\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.743259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.743626 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.844963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.845292 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.845317 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-scripts\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.845340 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvbs2\" (UniqueName: \"kubernetes.io/projected/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-kube-api-access-tvbs2\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.845395 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.845425 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.845447 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-config\") pod \"ovn-northd-0\" (UID: 
\"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.845456 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.846409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-scripts\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.846547 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-config\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.851104 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.851231 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.851341 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:33 crc kubenswrapper[4749]: I0310 16:07:33.867993 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvbs2\" (UniqueName: \"kubernetes.io/projected/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-kube-api-access-tvbs2\") pod \"ovn-northd-0\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " pod="openstack/ovn-northd-0" Mar 10 16:07:34 crc kubenswrapper[4749]: I0310 16:07:34.036619 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 16:07:34 crc kubenswrapper[4749]: I0310 16:07:34.194655 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2bf7c072-7f7d-4f94-98a5-023b069f0eab","Type":"ContainerStarted","Data":"cc722bd2c5f210c8bfbab86f5f74d73900920f0d1ede31e8eea776e399075b73"} Mar 10 16:07:34 crc kubenswrapper[4749]: I0310 16:07:34.203856 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"feb87bc1-b9a8-44e7-8603-ba656ef9e65c","Type":"ContainerStarted","Data":"0ee3f2375f085150303c2b6ba4607901034d43e9521515b17af0d05668a11f6f"} Mar 10 16:07:34 crc kubenswrapper[4749]: I0310 16:07:34.219483 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.971041659 podStartE2EDuration="32.219458604s" podCreationTimestamp="2026-03-10 16:07:02 +0000 UTC" firstStartedPulling="2026-03-10 16:07:15.109489253 +0000 UTC m=+1132.231354940" lastFinishedPulling="2026-03-10 16:07:26.357906198 +0000 UTC m=+1143.479771885" observedRunningTime="2026-03-10 16:07:34.215801073 +0000 UTC m=+1151.337666760" watchObservedRunningTime="2026-03-10 16:07:34.219458604 +0000 UTC m=+1151.341324281" Mar 10 16:07:34 crc kubenswrapper[4749]: I0310 
16:07:34.265208 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.254311161 podStartE2EDuration="31.265164729s" podCreationTimestamp="2026-03-10 16:07:03 +0000 UTC" firstStartedPulling="2026-03-10 16:07:14.61923533 +0000 UTC m=+1131.741101017" lastFinishedPulling="2026-03-10 16:07:26.630088908 +0000 UTC m=+1143.751954585" observedRunningTime="2026-03-10 16:07:34.243066127 +0000 UTC m=+1151.364931824" watchObservedRunningTime="2026-03-10 16:07:34.265164729 +0000 UTC m=+1151.387030416" Mar 10 16:07:34 crc kubenswrapper[4749]: I0310 16:07:34.544455 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 16:07:34 crc kubenswrapper[4749]: W0310 16:07:34.552729 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode794ff07_5e05_4d6c_8cc6_64efd90fd91b.slice/crio-0440e056788544c1f196ee262620191e228224c41ab8efd8625523c81eee7d1a WatchSource:0}: Error finding container 0440e056788544c1f196ee262620191e228224c41ab8efd8625523c81eee7d1a: Status 404 returned error can't find the container with id 0440e056788544c1f196ee262620191e228224c41ab8efd8625523c81eee7d1a Mar 10 16:07:34 crc kubenswrapper[4749]: I0310 16:07:34.967950 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:34 crc kubenswrapper[4749]: I0310 16:07:34.968014 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:35 crc kubenswrapper[4749]: I0310 16:07:35.213462 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e794ff07-5e05-4d6c-8cc6-64efd90fd91b","Type":"ContainerStarted","Data":"0440e056788544c1f196ee262620191e228224c41ab8efd8625523c81eee7d1a"} Mar 10 16:07:35 crc kubenswrapper[4749]: I0310 16:07:35.287610 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 10 16:07:36 crc kubenswrapper[4749]: I0310 16:07:36.221252 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e794ff07-5e05-4d6c-8cc6-64efd90fd91b","Type":"ContainerStarted","Data":"38f4b6694812060a48587ca6ffe71a2cddf74223e2fc0f5cf662287449dd9c81"} Mar 10 16:07:36 crc kubenswrapper[4749]: I0310 16:07:36.221622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e794ff07-5e05-4d6c-8cc6-64efd90fd91b","Type":"ContainerStarted","Data":"ca2716f51a54c46af5450d10ce7392124f1eeace51046ae8bee36f5a018d9fcf"} Mar 10 16:07:36 crc kubenswrapper[4749]: I0310 16:07:36.221641 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 10 16:07:36 crc kubenswrapper[4749]: I0310 16:07:36.245438 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.029356021 podStartE2EDuration="3.245418964s" podCreationTimestamp="2026-03-10 16:07:33 +0000 UTC" firstStartedPulling="2026-03-10 16:07:34.554873444 +0000 UTC m=+1151.676739161" lastFinishedPulling="2026-03-10 16:07:35.770936417 +0000 UTC m=+1152.892802104" observedRunningTime="2026-03-10 16:07:36.239793718 +0000 UTC m=+1153.361659405" watchObservedRunningTime="2026-03-10 16:07:36.245418964 +0000 UTC m=+1153.367284651" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.515970 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-stfb7"] Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.516408 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-659ddb758c-stfb7" podUID="6f7d5f3d-0ff3-4241-ba84-ba9e492dca64" containerName="dnsmasq-dns" containerID="cri-o://ef6e847ed7fe905a6c28fba2789a60db5628671a7f94b45ea68c0d9d6144904d" gracePeriod=10 Mar 10 
16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.521251 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.558435 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58df884995-tlc2v"] Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.559994 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.590783 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58df884995-tlc2v"] Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.663111 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-dns-svc\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.663200 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-config\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.663245 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2phk4\" (UniqueName: \"kubernetes.io/projected/55c4b035-df08-431f-bee2-d02a4709086c-kube-api-access-2phk4\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.663265 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-ovsdbserver-nb\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.663338 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-ovsdbserver-sb\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.765020 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-ovsdbserver-nb\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.765150 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-ovsdbserver-sb\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.765190 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-dns-svc\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.765244 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-config\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.765300 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2phk4\" (UniqueName: \"kubernetes.io/projected/55c4b035-df08-431f-bee2-d02a4709086c-kube-api-access-2phk4\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.765831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-ovsdbserver-nb\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.766317 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-dns-svc\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.766536 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-ovsdbserver-sb\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.766910 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-config\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.805496 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2phk4\" (UniqueName: \"kubernetes.io/projected/55c4b035-df08-431f-bee2-d02a4709086c-kube-api-access-2phk4\") pod \"dnsmasq-dns-58df884995-tlc2v\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") " pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.971574 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:37 crc kubenswrapper[4749]: I0310 16:07:37.979728 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.068750 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-ovsdbserver-nb\") pod \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.068816 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-dns-svc\") pod \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.068877 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-ovsdbserver-sb\") pod \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\" (UID: 
\"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.068899 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74wwb\" (UniqueName: \"kubernetes.io/projected/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-kube-api-access-74wwb\") pod \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.068972 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-config\") pod \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\" (UID: \"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64\") " Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.092777 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-kube-api-access-74wwb" (OuterVolumeSpecName: "kube-api-access-74wwb") pod "6f7d5f3d-0ff3-4241-ba84-ba9e492dca64" (UID: "6f7d5f3d-0ff3-4241-ba84-ba9e492dca64"). InnerVolumeSpecName "kube-api-access-74wwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.154314 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6f7d5f3d-0ff3-4241-ba84-ba9e492dca64" (UID: "6f7d5f3d-0ff3-4241-ba84-ba9e492dca64"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.155188 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6f7d5f3d-0ff3-4241-ba84-ba9e492dca64" (UID: "6f7d5f3d-0ff3-4241-ba84-ba9e492dca64"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.159713 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-config" (OuterVolumeSpecName: "config") pod "6f7d5f3d-0ff3-4241-ba84-ba9e492dca64" (UID: "6f7d5f3d-0ff3-4241-ba84-ba9e492dca64"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.165542 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6f7d5f3d-0ff3-4241-ba84-ba9e492dca64" (UID: "6f7d5f3d-0ff3-4241-ba84-ba9e492dca64"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.177510 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.177540 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.177551 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.177559 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 
16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.177568 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74wwb\" (UniqueName: \"kubernetes.io/projected/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64-kube-api-access-74wwb\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.248276 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f7d5f3d-0ff3-4241-ba84-ba9e492dca64" containerID="ef6e847ed7fe905a6c28fba2789a60db5628671a7f94b45ea68c0d9d6144904d" exitCode=0 Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.248340 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-stfb7" event={"ID":"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64","Type":"ContainerDied","Data":"ef6e847ed7fe905a6c28fba2789a60db5628671a7f94b45ea68c0d9d6144904d"} Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.248367 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-659ddb758c-stfb7" event={"ID":"6f7d5f3d-0ff3-4241-ba84-ba9e492dca64","Type":"ContainerDied","Data":"f186d2c2d7c0f29e8d2ec182342467e180950ff2a28df3adeab2d11891fbba66"} Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.248410 4749 scope.go:117] "RemoveContainer" containerID="ef6e847ed7fe905a6c28fba2789a60db5628671a7f94b45ea68c0d9d6144904d" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.248622 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-659ddb758c-stfb7" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.279550 4749 scope.go:117] "RemoveContainer" containerID="476dbab2ba7df7b810605e89700815702263dd723f55b154c4b0174931847dc0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.288372 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-stfb7"] Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.305897 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-659ddb758c-stfb7"] Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.318775 4749 scope.go:117] "RemoveContainer" containerID="ef6e847ed7fe905a6c28fba2789a60db5628671a7f94b45ea68c0d9d6144904d" Mar 10 16:07:38 crc kubenswrapper[4749]: E0310 16:07:38.320172 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef6e847ed7fe905a6c28fba2789a60db5628671a7f94b45ea68c0d9d6144904d\": container with ID starting with ef6e847ed7fe905a6c28fba2789a60db5628671a7f94b45ea68c0d9d6144904d not found: ID does not exist" containerID="ef6e847ed7fe905a6c28fba2789a60db5628671a7f94b45ea68c0d9d6144904d" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.320218 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef6e847ed7fe905a6c28fba2789a60db5628671a7f94b45ea68c0d9d6144904d"} err="failed to get container status \"ef6e847ed7fe905a6c28fba2789a60db5628671a7f94b45ea68c0d9d6144904d\": rpc error: code = NotFound desc = could not find container \"ef6e847ed7fe905a6c28fba2789a60db5628671a7f94b45ea68c0d9d6144904d\": container with ID starting with ef6e847ed7fe905a6c28fba2789a60db5628671a7f94b45ea68c0d9d6144904d not found: ID does not exist" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.320256 4749 scope.go:117] "RemoveContainer" containerID="476dbab2ba7df7b810605e89700815702263dd723f55b154c4b0174931847dc0" Mar 10 
16:07:38 crc kubenswrapper[4749]: E0310 16:07:38.321575 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476dbab2ba7df7b810605e89700815702263dd723f55b154c4b0174931847dc0\": container with ID starting with 476dbab2ba7df7b810605e89700815702263dd723f55b154c4b0174931847dc0 not found: ID does not exist" containerID="476dbab2ba7df7b810605e89700815702263dd723f55b154c4b0174931847dc0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.321613 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476dbab2ba7df7b810605e89700815702263dd723f55b154c4b0174931847dc0"} err="failed to get container status \"476dbab2ba7df7b810605e89700815702263dd723f55b154c4b0174931847dc0\": rpc error: code = NotFound desc = could not find container \"476dbab2ba7df7b810605e89700815702263dd723f55b154c4b0174931847dc0\": container with ID starting with 476dbab2ba7df7b810605e89700815702263dd723f55b154c4b0174931847dc0 not found: ID does not exist" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.594790 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58df884995-tlc2v"] Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.653253 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 10 16:07:38 crc kubenswrapper[4749]: E0310 16:07:38.653602 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7d5f3d-0ff3-4241-ba84-ba9e492dca64" containerName="dnsmasq-dns" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.653620 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f7d5f3d-0ff3-4241-ba84-ba9e492dca64" containerName="dnsmasq-dns" Mar 10 16:07:38 crc kubenswrapper[4749]: E0310 16:07:38.653636 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7d5f3d-0ff3-4241-ba84-ba9e492dca64" containerName="init" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 
16:07:38.653643 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f7d5f3d-0ff3-4241-ba84-ba9e492dca64" containerName="init" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.653800 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f7d5f3d-0ff3-4241-ba84-ba9e492dca64" containerName="dnsmasq-dns" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.659516 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.664576 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.664688 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.664600 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-klks2" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.664636 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.675123 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.788828 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcbxq\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-kube-api-access-zcbxq\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.789695 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d50314-7d2d-4d92-9a78-846a573a3000-combined-ca-bundle\") pod 
\"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.789763 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/85d50314-7d2d-4d92-9a78-846a573a3000-cache\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.789787 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.789978 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/85d50314-7d2d-4d92-9a78-846a573a3000-lock\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.790158 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.893234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/85d50314-7d2d-4d92-9a78-846a573a3000-lock\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.893335 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.893453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcbxq\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-kube-api-access-zcbxq\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.893522 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d50314-7d2d-4d92-9a78-846a573a3000-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.893581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/85d50314-7d2d-4d92-9a78-846a573a3000-cache\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.893616 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: E0310 16:07:38.893821 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 16:07:38 crc kubenswrapper[4749]: E0310 16:07:38.893845 4749 projected.go:194] Error preparing data for projected volume 
etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.893873 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: E0310 16:07:38.893907 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift podName:85d50314-7d2d-4d92-9a78-846a573a3000 nodeName:}" failed. No retries permitted until 2026-03-10 16:07:39.393876405 +0000 UTC m=+1156.515742102 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift") pod "swift-storage-0" (UID: "85d50314-7d2d-4d92-9a78-846a573a3000") : configmap "swift-ring-files" not found Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.894170 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/85d50314-7d2d-4d92-9a78-846a573a3000-lock\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.894841 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/85d50314-7d2d-4d92-9a78-846a573a3000-cache\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.917114 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/85d50314-7d2d-4d92-9a78-846a573a3000-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.919547 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcbxq\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-kube-api-access-zcbxq\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.919610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.961524 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-kxbww"] Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.963345 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.966681 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.967090 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.967409 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.983093 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kxbww"] Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.996146 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-combined-ca-bundle\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.996245 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d88b3e71-b8ae-44cf-a104-3236bc27a87f-ring-data-devices\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.996280 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d88b3e71-b8ae-44cf-a104-3236bc27a87f-etc-swift\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 
16:07:38.996357 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d88b3e71-b8ae-44cf-a104-3236bc27a87f-scripts\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.996492 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrhfm\" (UniqueName: \"kubernetes.io/projected/d88b3e71-b8ae-44cf-a104-3236bc27a87f-kube-api-access-mrhfm\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.996553 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-dispersionconf\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:38 crc kubenswrapper[4749]: I0310 16:07:38.996605 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-swiftconf\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.097789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d88b3e71-b8ae-44cf-a104-3236bc27a87f-etc-swift\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.098156 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d88b3e71-b8ae-44cf-a104-3236bc27a87f-scripts\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.098246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrhfm\" (UniqueName: \"kubernetes.io/projected/d88b3e71-b8ae-44cf-a104-3236bc27a87f-kube-api-access-mrhfm\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.098334 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-dispersionconf\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.098454 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-swiftconf\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.098573 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-combined-ca-bundle\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.098679 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d88b3e71-b8ae-44cf-a104-3236bc27a87f-ring-data-devices\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.098936 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d88b3e71-b8ae-44cf-a104-3236bc27a87f-scripts\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.098357 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d88b3e71-b8ae-44cf-a104-3236bc27a87f-etc-swift\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.099445 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d88b3e71-b8ae-44cf-a104-3236bc27a87f-ring-data-devices\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.102500 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-swiftconf\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.102591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-dispersionconf\") pod 
\"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.103238 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-combined-ca-bundle\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.118867 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrhfm\" (UniqueName: \"kubernetes.io/projected/d88b3e71-b8ae-44cf-a104-3236bc27a87f-kube-api-access-mrhfm\") pod \"swift-ring-rebalance-kxbww\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.258751 4749 generic.go:334] "Generic (PLEG): container finished" podID="55c4b035-df08-431f-bee2-d02a4709086c" containerID="603b6b281fddc022de160bba7a0bb0d9919ef6c177c6e360d7e71a76f96d0e36" exitCode=0 Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.259011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-tlc2v" event={"ID":"55c4b035-df08-431f-bee2-d02a4709086c","Type":"ContainerDied","Data":"603b6b281fddc022de160bba7a0bb0d9919ef6c177c6e360d7e71a76f96d0e36"} Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.259117 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-tlc2v" event={"ID":"55c4b035-df08-431f-bee2-d02a4709086c","Type":"ContainerStarted","Data":"2a548d1983e838050ed345201ab4005b153161bef3ed16e7fbb80231b2563b5e"} Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.296517 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.403340 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:39 crc kubenswrapper[4749]: E0310 16:07:39.403558 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 16:07:39 crc kubenswrapper[4749]: E0310 16:07:39.403843 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 16:07:39 crc kubenswrapper[4749]: E0310 16:07:39.403918 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift podName:85d50314-7d2d-4d92-9a78-846a573a3000 nodeName:}" failed. No retries permitted until 2026-03-10 16:07:40.403894555 +0000 UTC m=+1157.525760242 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift") pod "swift-storage-0" (UID: "85d50314-7d2d-4d92-9a78-846a573a3000") : configmap "swift-ring-files" not found Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.615862 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f7d5f3d-0ff3-4241-ba84-ba9e492dca64" path="/var/lib/kubelet/pods/6f7d5f3d-0ff3-4241-ba84-ba9e492dca64/volumes" Mar 10 16:07:39 crc kubenswrapper[4749]: I0310 16:07:39.758944 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kxbww"] Mar 10 16:07:39 crc kubenswrapper[4749]: W0310 16:07:39.769679 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd88b3e71_b8ae_44cf_a104_3236bc27a87f.slice/crio-9f39e56e1accee8efd1bcd78acd3270c395fb172e9e5a7313dbd9bc6172a5f85 WatchSource:0}: Error finding container 9f39e56e1accee8efd1bcd78acd3270c395fb172e9e5a7313dbd9bc6172a5f85: Status 404 returned error can't find the container with id 9f39e56e1accee8efd1bcd78acd3270c395fb172e9e5a7313dbd9bc6172a5f85 Mar 10 16:07:40 crc kubenswrapper[4749]: I0310 16:07:40.271222 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-tlc2v" event={"ID":"55c4b035-df08-431f-bee2-d02a4709086c","Type":"ContainerStarted","Data":"f685efb592052a56f6c94e2ace255971bb71e13b88fded9702e4c76fa73446ae"} Mar 10 16:07:40 crc kubenswrapper[4749]: I0310 16:07:40.271338 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:40 crc kubenswrapper[4749]: I0310 16:07:40.272962 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kxbww" 
event={"ID":"d88b3e71-b8ae-44cf-a104-3236bc27a87f","Type":"ContainerStarted","Data":"9f39e56e1accee8efd1bcd78acd3270c395fb172e9e5a7313dbd9bc6172a5f85"} Mar 10 16:07:40 crc kubenswrapper[4749]: I0310 16:07:40.299425 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58df884995-tlc2v" podStartSLOduration=3.299354409 podStartE2EDuration="3.299354409s" podCreationTimestamp="2026-03-10 16:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:07:40.292723125 +0000 UTC m=+1157.414588812" watchObservedRunningTime="2026-03-10 16:07:40.299354409 +0000 UTC m=+1157.421220136" Mar 10 16:07:40 crc kubenswrapper[4749]: I0310 16:07:40.421034 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:40 crc kubenswrapper[4749]: E0310 16:07:40.421265 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 16:07:40 crc kubenswrapper[4749]: E0310 16:07:40.421301 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 16:07:40 crc kubenswrapper[4749]: E0310 16:07:40.421368 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift podName:85d50314-7d2d-4d92-9a78-846a573a3000 nodeName:}" failed. No retries permitted until 2026-03-10 16:07:42.421346243 +0000 UTC m=+1159.543211940 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift") pod "swift-storage-0" (UID: "85d50314-7d2d-4d92-9a78-846a573a3000") : configmap "swift-ring-files" not found Mar 10 16:07:41 crc kubenswrapper[4749]: I0310 16:07:41.083904 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:41 crc kubenswrapper[4749]: I0310 16:07:41.183597 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 10 16:07:41 crc kubenswrapper[4749]: I0310 16:07:41.282113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0a876bab-aa64-429f-bcb8-7e644cc4f547","Type":"ContainerStarted","Data":"179d78e0fc74dfc60f01357573da8d062315f456cade3c85a12fccf2e1aae2e5"} Mar 10 16:07:41 crc kubenswrapper[4749]: I0310 16:07:41.283130 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 16:07:41 crc kubenswrapper[4749]: I0310 16:07:41.300166 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.141964552 podStartE2EDuration="34.300142306s" podCreationTimestamp="2026-03-10 16:07:07 +0000 UTC" firstStartedPulling="2026-03-10 16:07:14.827493102 +0000 UTC m=+1131.949358789" lastFinishedPulling="2026-03-10 16:07:40.985670836 +0000 UTC m=+1158.107536543" observedRunningTime="2026-03-10 16:07:41.296594328 +0000 UTC m=+1158.418460015" watchObservedRunningTime="2026-03-10 16:07:41.300142306 +0000 UTC m=+1158.422007993" Mar 10 16:07:42 crc kubenswrapper[4749]: I0310 16:07:42.467955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift\") pod \"swift-storage-0\" (UID: 
\"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:42 crc kubenswrapper[4749]: E0310 16:07:42.468163 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 16:07:42 crc kubenswrapper[4749]: E0310 16:07:42.468398 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 16:07:42 crc kubenswrapper[4749]: E0310 16:07:42.468482 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift podName:85d50314-7d2d-4d92-9a78-846a573a3000 nodeName:}" failed. No retries permitted until 2026-03-10 16:07:46.468459128 +0000 UTC m=+1163.590324815 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift") pod "swift-storage-0" (UID: "85d50314-7d2d-4d92-9a78-846a573a3000") : configmap "swift-ring-files" not found Mar 10 16:07:43 crc kubenswrapper[4749]: I0310 16:07:43.560599 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 10 16:07:43 crc kubenswrapper[4749]: I0310 16:07:43.560915 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 10 16:07:43 crc kubenswrapper[4749]: I0310 16:07:43.660935 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 10 16:07:43 crc kubenswrapper[4749]: I0310 16:07:43.675680 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vrhf8"] Mar 10 16:07:43 crc kubenswrapper[4749]: I0310 16:07:43.677462 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vrhf8" Mar 10 16:07:43 crc kubenswrapper[4749]: I0310 16:07:43.679872 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 10 16:07:43 crc kubenswrapper[4749]: I0310 16:07:43.693994 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vrhf8"] Mar 10 16:07:43 crc kubenswrapper[4749]: I0310 16:07:43.708614 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d819df58-87d2-4495-8999-657b76f0e906-operator-scripts\") pod \"root-account-create-update-vrhf8\" (UID: \"d819df58-87d2-4495-8999-657b76f0e906\") " pod="openstack/root-account-create-update-vrhf8" Mar 10 16:07:43 crc kubenswrapper[4749]: I0310 16:07:43.708774 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl8sk\" (UniqueName: \"kubernetes.io/projected/d819df58-87d2-4495-8999-657b76f0e906-kube-api-access-cl8sk\") pod \"root-account-create-update-vrhf8\" (UID: \"d819df58-87d2-4495-8999-657b76f0e906\") " pod="openstack/root-account-create-update-vrhf8" Mar 10 16:07:43 crc kubenswrapper[4749]: I0310 16:07:43.810338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d819df58-87d2-4495-8999-657b76f0e906-operator-scripts\") pod \"root-account-create-update-vrhf8\" (UID: \"d819df58-87d2-4495-8999-657b76f0e906\") " pod="openstack/root-account-create-update-vrhf8" Mar 10 16:07:43 crc kubenswrapper[4749]: I0310 16:07:43.810477 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl8sk\" (UniqueName: \"kubernetes.io/projected/d819df58-87d2-4495-8999-657b76f0e906-kube-api-access-cl8sk\") pod \"root-account-create-update-vrhf8\" (UID: 
\"d819df58-87d2-4495-8999-657b76f0e906\") " pod="openstack/root-account-create-update-vrhf8" Mar 10 16:07:43 crc kubenswrapper[4749]: I0310 16:07:43.811290 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d819df58-87d2-4495-8999-657b76f0e906-operator-scripts\") pod \"root-account-create-update-vrhf8\" (UID: \"d819df58-87d2-4495-8999-657b76f0e906\") " pod="openstack/root-account-create-update-vrhf8" Mar 10 16:07:43 crc kubenswrapper[4749]: I0310 16:07:43.828963 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl8sk\" (UniqueName: \"kubernetes.io/projected/d819df58-87d2-4495-8999-657b76f0e906-kube-api-access-cl8sk\") pod \"root-account-create-update-vrhf8\" (UID: \"d819df58-87d2-4495-8999-657b76f0e906\") " pod="openstack/root-account-create-update-vrhf8" Mar 10 16:07:43 crc kubenswrapper[4749]: I0310 16:07:43.993499 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vrhf8" Mar 10 16:07:44 crc kubenswrapper[4749]: I0310 16:07:44.316170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kxbww" event={"ID":"d88b3e71-b8ae-44cf-a104-3236bc27a87f","Type":"ContainerStarted","Data":"940bacbed596a6b64b32192506d5b4b3e282715aad28c953f1f7f4c388805cb7"} Mar 10 16:07:44 crc kubenswrapper[4749]: I0310 16:07:44.351114 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-kxbww" podStartSLOduration=2.79317102 podStartE2EDuration="6.351068222s" podCreationTimestamp="2026-03-10 16:07:38 +0000 UTC" firstStartedPulling="2026-03-10 16:07:39.772937355 +0000 UTC m=+1156.894803042" lastFinishedPulling="2026-03-10 16:07:43.330834557 +0000 UTC m=+1160.452700244" observedRunningTime="2026-03-10 16:07:44.331824569 +0000 UTC m=+1161.453690256" watchObservedRunningTime="2026-03-10 16:07:44.351068222 +0000 UTC m=+1161.472933909" Mar 10 16:07:44 crc kubenswrapper[4749]: I0310 16:07:44.398659 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 10 16:07:44 crc kubenswrapper[4749]: I0310 16:07:44.493930 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vrhf8"] Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.324329 4749 generic.go:334] "Generic (PLEG): container finished" podID="d819df58-87d2-4495-8999-657b76f0e906" containerID="d4691533207b9e7038404071271c9309ab895a8313e0edf96cf9f1552bd59949" exitCode=0 Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.324508 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vrhf8" event={"ID":"d819df58-87d2-4495-8999-657b76f0e906","Type":"ContainerDied","Data":"d4691533207b9e7038404071271c9309ab895a8313e0edf96cf9f1552bd59949"} Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.324688 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vrhf8" event={"ID":"d819df58-87d2-4495-8999-657b76f0e906","Type":"ContainerStarted","Data":"7bcc2f1388bf846dd7830ce2d624a1e1acdf0edff561f6db528a9b3322f1ad4c"} Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.449733 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-vxt5v"] Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.450965 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vxt5v" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.474951 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-vxt5v"] Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.538011 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68ebbd2-399e-4e9d-938d-c3209c46d76a-operator-scripts\") pod \"glance-db-create-vxt5v\" (UID: \"b68ebbd2-399e-4e9d-938d-c3209c46d76a\") " pod="openstack/glance-db-create-vxt5v" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.538164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jg9n\" (UniqueName: \"kubernetes.io/projected/b68ebbd2-399e-4e9d-938d-c3209c46d76a-kube-api-access-2jg9n\") pod \"glance-db-create-vxt5v\" (UID: \"b68ebbd2-399e-4e9d-938d-c3209c46d76a\") " pod="openstack/glance-db-create-vxt5v" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.616302 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8701-account-create-update-lp5d6"] Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.617441 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8701-account-create-update-lp5d6" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.622861 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.626940 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8701-account-create-update-lp5d6"] Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.639648 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jg9n\" (UniqueName: \"kubernetes.io/projected/b68ebbd2-399e-4e9d-938d-c3209c46d76a-kube-api-access-2jg9n\") pod \"glance-db-create-vxt5v\" (UID: \"b68ebbd2-399e-4e9d-938d-c3209c46d76a\") " pod="openstack/glance-db-create-vxt5v" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.639726 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a-operator-scripts\") pod \"glance-8701-account-create-update-lp5d6\" (UID: \"41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a\") " pod="openstack/glance-8701-account-create-update-lp5d6" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.639809 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7ql9\" (UniqueName: \"kubernetes.io/projected/41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a-kube-api-access-d7ql9\") pod \"glance-8701-account-create-update-lp5d6\" (UID: \"41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a\") " pod="openstack/glance-8701-account-create-update-lp5d6" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.639862 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68ebbd2-399e-4e9d-938d-c3209c46d76a-operator-scripts\") pod \"glance-db-create-vxt5v\" (UID: 
\"b68ebbd2-399e-4e9d-938d-c3209c46d76a\") " pod="openstack/glance-db-create-vxt5v" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.641737 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68ebbd2-399e-4e9d-938d-c3209c46d76a-operator-scripts\") pod \"glance-db-create-vxt5v\" (UID: \"b68ebbd2-399e-4e9d-938d-c3209c46d76a\") " pod="openstack/glance-db-create-vxt5v" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.668194 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jg9n\" (UniqueName: \"kubernetes.io/projected/b68ebbd2-399e-4e9d-938d-c3209c46d76a-kube-api-access-2jg9n\") pod \"glance-db-create-vxt5v\" (UID: \"b68ebbd2-399e-4e9d-938d-c3209c46d76a\") " pod="openstack/glance-db-create-vxt5v" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.741512 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a-operator-scripts\") pod \"glance-8701-account-create-update-lp5d6\" (UID: \"41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a\") " pod="openstack/glance-8701-account-create-update-lp5d6" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.741651 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7ql9\" (UniqueName: \"kubernetes.io/projected/41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a-kube-api-access-d7ql9\") pod \"glance-8701-account-create-update-lp5d6\" (UID: \"41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a\") " pod="openstack/glance-8701-account-create-update-lp5d6" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.742437 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a-operator-scripts\") pod \"glance-8701-account-create-update-lp5d6\" (UID: 
\"41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a\") " pod="openstack/glance-8701-account-create-update-lp5d6" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.758847 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7ql9\" (UniqueName: \"kubernetes.io/projected/41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a-kube-api-access-d7ql9\") pod \"glance-8701-account-create-update-lp5d6\" (UID: \"41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a\") " pod="openstack/glance-8701-account-create-update-lp5d6" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.776695 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vxt5v" Mar 10 16:07:45 crc kubenswrapper[4749]: I0310 16:07:45.936094 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8701-account-create-update-lp5d6" Mar 10 16:07:46 crc kubenswrapper[4749]: W0310 16:07:46.222016 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb68ebbd2_399e_4e9d_938d_c3209c46d76a.slice/crio-4ed6102375627e4bad5737ed88f56927e94b1d2c4b3dde6ded3c024f8090dfd5 WatchSource:0}: Error finding container 4ed6102375627e4bad5737ed88f56927e94b1d2c4b3dde6ded3c024f8090dfd5: Status 404 returned error can't find the container with id 4ed6102375627e4bad5737ed88f56927e94b1d2c4b3dde6ded3c024f8090dfd5 Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.227601 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-vxt5v"] Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.333302 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vxt5v" event={"ID":"b68ebbd2-399e-4e9d-938d-c3209c46d76a","Type":"ContainerStarted","Data":"4ed6102375627e4bad5737ed88f56927e94b1d2c4b3dde6ded3c024f8090dfd5"} Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.434261 4749 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/keystone-db-create-q5g9w"] Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.435893 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q5g9w" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.452207 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q5g9w"] Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.478097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:46 crc kubenswrapper[4749]: E0310 16:07:46.479213 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 16:07:46 crc kubenswrapper[4749]: E0310 16:07:46.479238 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 16:07:46 crc kubenswrapper[4749]: E0310 16:07:46.479419 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift podName:85d50314-7d2d-4d92-9a78-846a573a3000 nodeName:}" failed. No retries permitted until 2026-03-10 16:07:54.479396124 +0000 UTC m=+1171.601261801 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift") pod "swift-storage-0" (UID: "85d50314-7d2d-4d92-9a78-846a573a3000") : configmap "swift-ring-files" not found Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.506236 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8701-account-create-update-lp5d6"] Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.572651 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0bf0-account-create-update-pxvvz"] Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.574055 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0bf0-account-create-update-pxvvz" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.579181 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.580480 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d73d287-764a-4274-a18c-f38c42e85d2d-operator-scripts\") pod \"keystone-db-create-q5g9w\" (UID: \"6d73d287-764a-4274-a18c-f38c42e85d2d\") " pod="openstack/keystone-db-create-q5g9w" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.580526 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwp7z\" (UniqueName: \"kubernetes.io/projected/6d73d287-764a-4274-a18c-f38c42e85d2d-kube-api-access-kwp7z\") pod \"keystone-db-create-q5g9w\" (UID: \"6d73d287-764a-4274-a18c-f38c42e85d2d\") " pod="openstack/keystone-db-create-q5g9w" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.587610 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0bf0-account-create-update-pxvvz"] Mar 10 16:07:46 crc kubenswrapper[4749]: 
I0310 16:07:46.661666 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-b5l2s"] Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.662835 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b5l2s" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.681778 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d73d287-764a-4274-a18c-f38c42e85d2d-operator-scripts\") pod \"keystone-db-create-q5g9w\" (UID: \"6d73d287-764a-4274-a18c-f38c42e85d2d\") " pod="openstack/keystone-db-create-q5g9w" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.681818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwp7z\" (UniqueName: \"kubernetes.io/projected/6d73d287-764a-4274-a18c-f38c42e85d2d-kube-api-access-kwp7z\") pod \"keystone-db-create-q5g9w\" (UID: \"6d73d287-764a-4274-a18c-f38c42e85d2d\") " pod="openstack/keystone-db-create-q5g9w" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.681887 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1120a11-86ed-4714-891a-04bcba2a8ea2-operator-scripts\") pod \"keystone-0bf0-account-create-update-pxvvz\" (UID: \"d1120a11-86ed-4714-891a-04bcba2a8ea2\") " pod="openstack/keystone-0bf0-account-create-update-pxvvz" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.681925 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfqpr\" (UniqueName: \"kubernetes.io/projected/d1120a11-86ed-4714-891a-04bcba2a8ea2-kube-api-access-wfqpr\") pod \"keystone-0bf0-account-create-update-pxvvz\" (UID: \"d1120a11-86ed-4714-891a-04bcba2a8ea2\") " pod="openstack/keystone-0bf0-account-create-update-pxvvz" Mar 10 16:07:46 crc 
kubenswrapper[4749]: I0310 16:07:46.683885 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d73d287-764a-4274-a18c-f38c42e85d2d-operator-scripts\") pod \"keystone-db-create-q5g9w\" (UID: \"6d73d287-764a-4274-a18c-f38c42e85d2d\") " pod="openstack/keystone-db-create-q5g9w" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.723736 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-b5l2s"] Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.784861 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1120a11-86ed-4714-891a-04bcba2a8ea2-operator-scripts\") pod \"keystone-0bf0-account-create-update-pxvvz\" (UID: \"d1120a11-86ed-4714-891a-04bcba2a8ea2\") " pod="openstack/keystone-0bf0-account-create-update-pxvvz" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.784928 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46tzn\" (UniqueName: \"kubernetes.io/projected/72ccda46-f693-4e5e-82e9-a874cafbceb8-kube-api-access-46tzn\") pod \"placement-db-create-b5l2s\" (UID: \"72ccda46-f693-4e5e-82e9-a874cafbceb8\") " pod="openstack/placement-db-create-b5l2s" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.784977 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfqpr\" (UniqueName: \"kubernetes.io/projected/d1120a11-86ed-4714-891a-04bcba2a8ea2-kube-api-access-wfqpr\") pod \"keystone-0bf0-account-create-update-pxvvz\" (UID: \"d1120a11-86ed-4714-891a-04bcba2a8ea2\") " pod="openstack/keystone-0bf0-account-create-update-pxvvz" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.785066 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/72ccda46-f693-4e5e-82e9-a874cafbceb8-operator-scripts\") pod \"placement-db-create-b5l2s\" (UID: \"72ccda46-f693-4e5e-82e9-a874cafbceb8\") " pod="openstack/placement-db-create-b5l2s" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.786008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1120a11-86ed-4714-891a-04bcba2a8ea2-operator-scripts\") pod \"keystone-0bf0-account-create-update-pxvvz\" (UID: \"d1120a11-86ed-4714-891a-04bcba2a8ea2\") " pod="openstack/keystone-0bf0-account-create-update-pxvvz" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.810345 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-35cb-account-create-update-8zfvq"] Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.812127 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-35cb-account-create-update-8zfvq" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.814000 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.818264 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-35cb-account-create-update-8zfvq"] Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.827030 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwp7z\" (UniqueName: \"kubernetes.io/projected/6d73d287-764a-4274-a18c-f38c42e85d2d-kube-api-access-kwp7z\") pod \"keystone-db-create-q5g9w\" (UID: \"6d73d287-764a-4274-a18c-f38c42e85d2d\") " pod="openstack/keystone-db-create-q5g9w" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.838681 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfqpr\" (UniqueName: \"kubernetes.io/projected/d1120a11-86ed-4714-891a-04bcba2a8ea2-kube-api-access-wfqpr\") pod 
\"keystone-0bf0-account-create-update-pxvvz\" (UID: \"d1120a11-86ed-4714-891a-04bcba2a8ea2\") " pod="openstack/keystone-0bf0-account-create-update-pxvvz" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.886168 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rk5x\" (UniqueName: \"kubernetes.io/projected/f643319b-ae93-4b48-b10d-7e7f5f27a7c6-kube-api-access-2rk5x\") pod \"placement-35cb-account-create-update-8zfvq\" (UID: \"f643319b-ae93-4b48-b10d-7e7f5f27a7c6\") " pod="openstack/placement-35cb-account-create-update-8zfvq" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.886220 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46tzn\" (UniqueName: \"kubernetes.io/projected/72ccda46-f693-4e5e-82e9-a874cafbceb8-kube-api-access-46tzn\") pod \"placement-db-create-b5l2s\" (UID: \"72ccda46-f693-4e5e-82e9-a874cafbceb8\") " pod="openstack/placement-db-create-b5l2s" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.886289 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f643319b-ae93-4b48-b10d-7e7f5f27a7c6-operator-scripts\") pod \"placement-35cb-account-create-update-8zfvq\" (UID: \"f643319b-ae93-4b48-b10d-7e7f5f27a7c6\") " pod="openstack/placement-35cb-account-create-update-8zfvq" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.886359 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72ccda46-f693-4e5e-82e9-a874cafbceb8-operator-scripts\") pod \"placement-db-create-b5l2s\" (UID: \"72ccda46-f693-4e5e-82e9-a874cafbceb8\") " pod="openstack/placement-db-create-b5l2s" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.887733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/72ccda46-f693-4e5e-82e9-a874cafbceb8-operator-scripts\") pod \"placement-db-create-b5l2s\" (UID: \"72ccda46-f693-4e5e-82e9-a874cafbceb8\") " pod="openstack/placement-db-create-b5l2s" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.904047 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46tzn\" (UniqueName: \"kubernetes.io/projected/72ccda46-f693-4e5e-82e9-a874cafbceb8-kube-api-access-46tzn\") pod \"placement-db-create-b5l2s\" (UID: \"72ccda46-f693-4e5e-82e9-a874cafbceb8\") " pod="openstack/placement-db-create-b5l2s" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.906749 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0bf0-account-create-update-pxvvz" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.989034 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f643319b-ae93-4b48-b10d-7e7f5f27a7c6-operator-scripts\") pod \"placement-35cb-account-create-update-8zfvq\" (UID: \"f643319b-ae93-4b48-b10d-7e7f5f27a7c6\") " pod="openstack/placement-35cb-account-create-update-8zfvq" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.989952 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f643319b-ae93-4b48-b10d-7e7f5f27a7c6-operator-scripts\") pod \"placement-35cb-account-create-update-8zfvq\" (UID: \"f643319b-ae93-4b48-b10d-7e7f5f27a7c6\") " pod="openstack/placement-35cb-account-create-update-8zfvq" Mar 10 16:07:46 crc kubenswrapper[4749]: I0310 16:07:46.990052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rk5x\" (UniqueName: \"kubernetes.io/projected/f643319b-ae93-4b48-b10d-7e7f5f27a7c6-kube-api-access-2rk5x\") pod \"placement-35cb-account-create-update-8zfvq\" (UID: \"f643319b-ae93-4b48-b10d-7e7f5f27a7c6\") 
" pod="openstack/placement-35cb-account-create-update-8zfvq" Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.015170 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rk5x\" (UniqueName: \"kubernetes.io/projected/f643319b-ae93-4b48-b10d-7e7f5f27a7c6-kube-api-access-2rk5x\") pod \"placement-35cb-account-create-update-8zfvq\" (UID: \"f643319b-ae93-4b48-b10d-7e7f5f27a7c6\") " pod="openstack/placement-35cb-account-create-update-8zfvq" Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.054243 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-35cb-account-create-update-8zfvq" Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.055583 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vrhf8" Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.087251 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b5l2s" Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.108649 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-q5g9w" Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.193989 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl8sk\" (UniqueName: \"kubernetes.io/projected/d819df58-87d2-4495-8999-657b76f0e906-kube-api-access-cl8sk\") pod \"d819df58-87d2-4495-8999-657b76f0e906\" (UID: \"d819df58-87d2-4495-8999-657b76f0e906\") " Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.194250 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d819df58-87d2-4495-8999-657b76f0e906-operator-scripts\") pod \"d819df58-87d2-4495-8999-657b76f0e906\" (UID: \"d819df58-87d2-4495-8999-657b76f0e906\") " Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.196796 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d819df58-87d2-4495-8999-657b76f0e906-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d819df58-87d2-4495-8999-657b76f0e906" (UID: "d819df58-87d2-4495-8999-657b76f0e906"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.201761 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d819df58-87d2-4495-8999-657b76f0e906-kube-api-access-cl8sk" (OuterVolumeSpecName: "kube-api-access-cl8sk") pod "d819df58-87d2-4495-8999-657b76f0e906" (UID: "d819df58-87d2-4495-8999-657b76f0e906"). InnerVolumeSpecName "kube-api-access-cl8sk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.296119 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl8sk\" (UniqueName: \"kubernetes.io/projected/d819df58-87d2-4495-8999-657b76f0e906-kube-api-access-cl8sk\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.296152 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d819df58-87d2-4495-8999-657b76f0e906-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.345111 4749 generic.go:334] "Generic (PLEG): container finished" podID="b68ebbd2-399e-4e9d-938d-c3209c46d76a" containerID="2a9d4b2ad47f1e5cf7acadb4a18e700e4bae86a432b69c0791bc4866a03bbcf6" exitCode=0 Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.345198 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vxt5v" event={"ID":"b68ebbd2-399e-4e9d-938d-c3209c46d76a","Type":"ContainerDied","Data":"2a9d4b2ad47f1e5cf7acadb4a18e700e4bae86a432b69c0791bc4866a03bbcf6"} Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.349075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vrhf8" event={"ID":"d819df58-87d2-4495-8999-657b76f0e906","Type":"ContainerDied","Data":"7bcc2f1388bf846dd7830ce2d624a1e1acdf0edff561f6db528a9b3322f1ad4c"} Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.349116 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bcc2f1388bf846dd7830ce2d624a1e1acdf0edff561f6db528a9b3322f1ad4c" Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.349190 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vrhf8" Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.351207 4749 generic.go:334] "Generic (PLEG): container finished" podID="41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a" containerID="d750aa36eabfd8fb7f0b2d62c0c5e42060ba161fab559795c901f4a4f19af276" exitCode=0 Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.351248 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8701-account-create-update-lp5d6" event={"ID":"41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a","Type":"ContainerDied","Data":"d750aa36eabfd8fb7f0b2d62c0c5e42060ba161fab559795c901f4a4f19af276"} Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.351272 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8701-account-create-update-lp5d6" event={"ID":"41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a","Type":"ContainerStarted","Data":"a7f3daaa6f7e0bb34eb71d5e5f21eb86f08cb65ea8ac50157ef9675f66a662d7"} Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.397937 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0bf0-account-create-update-pxvvz"] Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.549954 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-35cb-account-create-update-8zfvq"] Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.563035 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.662556 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-b5l2s"] Mar 10 16:07:47 crc kubenswrapper[4749]: W0310 16:07:47.678571 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72ccda46_f693_4e5e_82e9_a874cafbceb8.slice/crio-e37c883745ce2d52db3cdfcf80ad25dad747e94bec39632bdb88cd52d7904dc8 WatchSource:0}: 
Error finding container e37c883745ce2d52db3cdfcf80ad25dad747e94bec39632bdb88cd52d7904dc8: Status 404 returned error can't find the container with id e37c883745ce2d52db3cdfcf80ad25dad747e94bec39632bdb88cd52d7904dc8 Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.736156 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q5g9w"] Mar 10 16:07:47 crc kubenswrapper[4749]: W0310 16:07:47.761307 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d73d287_764a_4274_a18c_f38c42e85d2d.slice/crio-1f1b97dfa7a3afd6840f08448124753a25e7da4dbd2ced1ddf25a437193311b4 WatchSource:0}: Error finding container 1f1b97dfa7a3afd6840f08448124753a25e7da4dbd2ced1ddf25a437193311b4: Status 404 returned error can't find the container with id 1f1b97dfa7a3afd6840f08448124753a25e7da4dbd2ced1ddf25a437193311b4 Mar 10 16:07:47 crc kubenswrapper[4749]: I0310 16:07:47.974435 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.043745 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-x7462"] Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.044085 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" podUID="ec39734d-eadc-4736-9eb5-98ad1c2a233c" containerName="dnsmasq-dns" containerID="cri-o://2c4bb63f381069f7f5578676b40d32aa5572ac4ee8e5551e30910261f94f0589" gracePeriod=10 Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.376639 4749 generic.go:334] "Generic (PLEG): container finished" podID="d1120a11-86ed-4714-891a-04bcba2a8ea2" containerID="e7b6c31999ed5399571573a74497b8409d438bd24ed0481b690c7a327019e8a7" exitCode=0 Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.377113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-0bf0-account-create-update-pxvvz" event={"ID":"d1120a11-86ed-4714-891a-04bcba2a8ea2","Type":"ContainerDied","Data":"e7b6c31999ed5399571573a74497b8409d438bd24ed0481b690c7a327019e8a7"} Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.377141 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0bf0-account-create-update-pxvvz" event={"ID":"d1120a11-86ed-4714-891a-04bcba2a8ea2","Type":"ContainerStarted","Data":"a5a0311aa9760a9ccddb8e493e87cfbbf9ec05f733d16ac5dac8e6028c30d019"} Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.381864 4749 generic.go:334] "Generic (PLEG): container finished" podID="6d73d287-764a-4274-a18c-f38c42e85d2d" containerID="581d4900012f58fdc4fb204f974c1591440e9e62f7d5ce15a5af0e18990a645c" exitCode=0 Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.381920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q5g9w" event={"ID":"6d73d287-764a-4274-a18c-f38c42e85d2d","Type":"ContainerDied","Data":"581d4900012f58fdc4fb204f974c1591440e9e62f7d5ce15a5af0e18990a645c"} Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.381946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q5g9w" event={"ID":"6d73d287-764a-4274-a18c-f38c42e85d2d","Type":"ContainerStarted","Data":"1f1b97dfa7a3afd6840f08448124753a25e7da4dbd2ced1ddf25a437193311b4"} Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.387796 4749 generic.go:334] "Generic (PLEG): container finished" podID="72ccda46-f693-4e5e-82e9-a874cafbceb8" containerID="fa88134a23781b7668fa48a94dc185d173bf70de626fb3da7c54c3bf7925c7e0" exitCode=0 Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.387974 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b5l2s" event={"ID":"72ccda46-f693-4e5e-82e9-a874cafbceb8","Type":"ContainerDied","Data":"fa88134a23781b7668fa48a94dc185d173bf70de626fb3da7c54c3bf7925c7e0"} Mar 10 16:07:48 crc 
kubenswrapper[4749]: I0310 16:07:48.388003 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b5l2s" event={"ID":"72ccda46-f693-4e5e-82e9-a874cafbceb8","Type":"ContainerStarted","Data":"e37c883745ce2d52db3cdfcf80ad25dad747e94bec39632bdb88cd52d7904dc8"} Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.393276 4749 generic.go:334] "Generic (PLEG): container finished" podID="ec39734d-eadc-4736-9eb5-98ad1c2a233c" containerID="2c4bb63f381069f7f5578676b40d32aa5572ac4ee8e5551e30910261f94f0589" exitCode=0 Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.393351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" event={"ID":"ec39734d-eadc-4736-9eb5-98ad1c2a233c","Type":"ContainerDied","Data":"2c4bb63f381069f7f5578676b40d32aa5572ac4ee8e5551e30910261f94f0589"} Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.397390 4749 generic.go:334] "Generic (PLEG): container finished" podID="f643319b-ae93-4b48-b10d-7e7f5f27a7c6" containerID="3c06664a064c452407ad2995d00c78f80c14c3e61acf677ef85709e699cae912" exitCode=0 Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.397468 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35cb-account-create-update-8zfvq" event={"ID":"f643319b-ae93-4b48-b10d-7e7f5f27a7c6","Type":"ContainerDied","Data":"3c06664a064c452407ad2995d00c78f80c14c3e61acf677ef85709e699cae912"} Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.397490 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35cb-account-create-update-8zfvq" event={"ID":"f643319b-ae93-4b48-b10d-7e7f5f27a7c6","Type":"ContainerStarted","Data":"439a643c43cae456ed51b2ddf14e707a6c829894d6841ce76f6a28d1410e9bdc"} Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.529257 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.645950 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clgx9\" (UniqueName: \"kubernetes.io/projected/ec39734d-eadc-4736-9eb5-98ad1c2a233c-kube-api-access-clgx9\") pod \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\" (UID: \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\") " Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.646046 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec39734d-eadc-4736-9eb5-98ad1c2a233c-config\") pod \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\" (UID: \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\") " Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.646076 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec39734d-eadc-4736-9eb5-98ad1c2a233c-dns-svc\") pod \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\" (UID: \"ec39734d-eadc-4736-9eb5-98ad1c2a233c\") " Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.653063 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec39734d-eadc-4736-9eb5-98ad1c2a233c-kube-api-access-clgx9" (OuterVolumeSpecName: "kube-api-access-clgx9") pod "ec39734d-eadc-4736-9eb5-98ad1c2a233c" (UID: "ec39734d-eadc-4736-9eb5-98ad1c2a233c"). InnerVolumeSpecName "kube-api-access-clgx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.733986 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec39734d-eadc-4736-9eb5-98ad1c2a233c-config" (OuterVolumeSpecName: "config") pod "ec39734d-eadc-4736-9eb5-98ad1c2a233c" (UID: "ec39734d-eadc-4736-9eb5-98ad1c2a233c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.758883 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec39734d-eadc-4736-9eb5-98ad1c2a233c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec39734d-eadc-4736-9eb5-98ad1c2a233c" (UID: "ec39734d-eadc-4736-9eb5-98ad1c2a233c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.759741 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clgx9\" (UniqueName: \"kubernetes.io/projected/ec39734d-eadc-4736-9eb5-98ad1c2a233c-kube-api-access-clgx9\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.759768 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec39734d-eadc-4736-9eb5-98ad1c2a233c-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.759778 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec39734d-eadc-4736-9eb5-98ad1c2a233c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.895268 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8701-account-create-update-lp5d6" Mar 10 16:07:48 crc kubenswrapper[4749]: I0310 16:07:48.913151 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-vxt5v" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.065178 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7ql9\" (UniqueName: \"kubernetes.io/projected/41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a-kube-api-access-d7ql9\") pod \"41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a\" (UID: \"41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a\") " Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.065333 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a-operator-scripts\") pod \"41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a\" (UID: \"41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a\") " Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.065440 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jg9n\" (UniqueName: \"kubernetes.io/projected/b68ebbd2-399e-4e9d-938d-c3209c46d76a-kube-api-access-2jg9n\") pod \"b68ebbd2-399e-4e9d-938d-c3209c46d76a\" (UID: \"b68ebbd2-399e-4e9d-938d-c3209c46d76a\") " Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.065496 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68ebbd2-399e-4e9d-938d-c3209c46d76a-operator-scripts\") pod \"b68ebbd2-399e-4e9d-938d-c3209c46d76a\" (UID: \"b68ebbd2-399e-4e9d-938d-c3209c46d76a\") " Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.066151 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b68ebbd2-399e-4e9d-938d-c3209c46d76a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b68ebbd2-399e-4e9d-938d-c3209c46d76a" (UID: "b68ebbd2-399e-4e9d-938d-c3209c46d76a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.066188 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a" (UID: "41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.072310 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b68ebbd2-399e-4e9d-938d-c3209c46d76a-kube-api-access-2jg9n" (OuterVolumeSpecName: "kube-api-access-2jg9n") pod "b68ebbd2-399e-4e9d-938d-c3209c46d76a" (UID: "b68ebbd2-399e-4e9d-938d-c3209c46d76a"). InnerVolumeSpecName "kube-api-access-2jg9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.072468 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a-kube-api-access-d7ql9" (OuterVolumeSpecName: "kube-api-access-d7ql9") pod "41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a" (UID: "41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a"). InnerVolumeSpecName "kube-api-access-d7ql9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.167246 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.167583 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jg9n\" (UniqueName: \"kubernetes.io/projected/b68ebbd2-399e-4e9d-938d-c3209c46d76a-kube-api-access-2jg9n\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.167597 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68ebbd2-399e-4e9d-938d-c3209c46d76a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.167607 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7ql9\" (UniqueName: \"kubernetes.io/projected/41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a-kube-api-access-d7ql9\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.412279 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" event={"ID":"ec39734d-eadc-4736-9eb5-98ad1c2a233c","Type":"ContainerDied","Data":"6b5066527d0dc45ca8e6fe18f4edbfe8254cf6aa8c7b4a8ce9eb796b9b6c8d13"} Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.412318 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-x7462" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.412407 4749 scope.go:117] "RemoveContainer" containerID="2c4bb63f381069f7f5578676b40d32aa5572ac4ee8e5551e30910261f94f0589" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.416801 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8701-account-create-update-lp5d6" event={"ID":"41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a","Type":"ContainerDied","Data":"a7f3daaa6f7e0bb34eb71d5e5f21eb86f08cb65ea8ac50157ef9675f66a662d7"} Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.416845 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7f3daaa6f7e0bb34eb71d5e5f21eb86f08cb65ea8ac50157ef9675f66a662d7" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.416940 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8701-account-create-update-lp5d6" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.421496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vxt5v" event={"ID":"b68ebbd2-399e-4e9d-938d-c3209c46d76a","Type":"ContainerDied","Data":"4ed6102375627e4bad5737ed88f56927e94b1d2c4b3dde6ded3c024f8090dfd5"} Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.421598 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ed6102375627e4bad5737ed88f56927e94b1d2c4b3dde6ded3c024f8090dfd5" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.421619 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-vxt5v" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.470249 4749 scope.go:117] "RemoveContainer" containerID="9a063bf958dbfe07cdf4e542dce5d228c6e9130c53106cb68a3a0e9da9d593c8" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.471006 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-x7462"] Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.484008 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-x7462"] Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.617075 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec39734d-eadc-4736-9eb5-98ad1c2a233c" path="/var/lib/kubelet/pods/ec39734d-eadc-4736-9eb5-98ad1c2a233c/volumes" Mar 10 16:07:49 crc kubenswrapper[4749]: I0310 16:07:49.916518 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q5g9w" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.096139 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwp7z\" (UniqueName: \"kubernetes.io/projected/6d73d287-764a-4274-a18c-f38c42e85d2d-kube-api-access-kwp7z\") pod \"6d73d287-764a-4274-a18c-f38c42e85d2d\" (UID: \"6d73d287-764a-4274-a18c-f38c42e85d2d\") " Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.096246 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d73d287-764a-4274-a18c-f38c42e85d2d-operator-scripts\") pod \"6d73d287-764a-4274-a18c-f38c42e85d2d\" (UID: \"6d73d287-764a-4274-a18c-f38c42e85d2d\") " Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.097140 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d73d287-764a-4274-a18c-f38c42e85d2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"6d73d287-764a-4274-a18c-f38c42e85d2d" (UID: "6d73d287-764a-4274-a18c-f38c42e85d2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.103224 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d73d287-764a-4274-a18c-f38c42e85d2d-kube-api-access-kwp7z" (OuterVolumeSpecName: "kube-api-access-kwp7z") pod "6d73d287-764a-4274-a18c-f38c42e85d2d" (UID: "6d73d287-764a-4274-a18c-f38c42e85d2d"). InnerVolumeSpecName "kube-api-access-kwp7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.137357 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-35cb-account-create-update-8zfvq" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.152547 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b5l2s" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.176061 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0bf0-account-create-update-pxvvz" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.198485 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwp7z\" (UniqueName: \"kubernetes.io/projected/6d73d287-764a-4274-a18c-f38c42e85d2d-kube-api-access-kwp7z\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.198532 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d73d287-764a-4274-a18c-f38c42e85d2d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.299511 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rk5x\" (UniqueName: \"kubernetes.io/projected/f643319b-ae93-4b48-b10d-7e7f5f27a7c6-kube-api-access-2rk5x\") pod \"f643319b-ae93-4b48-b10d-7e7f5f27a7c6\" (UID: \"f643319b-ae93-4b48-b10d-7e7f5f27a7c6\") " Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.300085 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1120a11-86ed-4714-891a-04bcba2a8ea2-operator-scripts\") pod \"d1120a11-86ed-4714-891a-04bcba2a8ea2\" (UID: \"d1120a11-86ed-4714-891a-04bcba2a8ea2\") " Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.300116 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72ccda46-f693-4e5e-82e9-a874cafbceb8-operator-scripts\") pod \"72ccda46-f693-4e5e-82e9-a874cafbceb8\" (UID: \"72ccda46-f693-4e5e-82e9-a874cafbceb8\") " Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.300153 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfqpr\" (UniqueName: \"kubernetes.io/projected/d1120a11-86ed-4714-891a-04bcba2a8ea2-kube-api-access-wfqpr\") pod 
\"d1120a11-86ed-4714-891a-04bcba2a8ea2\" (UID: \"d1120a11-86ed-4714-891a-04bcba2a8ea2\") " Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.300187 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46tzn\" (UniqueName: \"kubernetes.io/projected/72ccda46-f693-4e5e-82e9-a874cafbceb8-kube-api-access-46tzn\") pod \"72ccda46-f693-4e5e-82e9-a874cafbceb8\" (UID: \"72ccda46-f693-4e5e-82e9-a874cafbceb8\") " Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.300233 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f643319b-ae93-4b48-b10d-7e7f5f27a7c6-operator-scripts\") pod \"f643319b-ae93-4b48-b10d-7e7f5f27a7c6\" (UID: \"f643319b-ae93-4b48-b10d-7e7f5f27a7c6\") " Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.301257 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f643319b-ae93-4b48-b10d-7e7f5f27a7c6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f643319b-ae93-4b48-b10d-7e7f5f27a7c6" (UID: "f643319b-ae93-4b48-b10d-7e7f5f27a7c6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.301337 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72ccda46-f693-4e5e-82e9-a874cafbceb8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72ccda46-f693-4e5e-82e9-a874cafbceb8" (UID: "72ccda46-f693-4e5e-82e9-a874cafbceb8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.301658 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1120a11-86ed-4714-891a-04bcba2a8ea2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d1120a11-86ed-4714-891a-04bcba2a8ea2" (UID: "d1120a11-86ed-4714-891a-04bcba2a8ea2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.304146 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1120a11-86ed-4714-891a-04bcba2a8ea2-kube-api-access-wfqpr" (OuterVolumeSpecName: "kube-api-access-wfqpr") pod "d1120a11-86ed-4714-891a-04bcba2a8ea2" (UID: "d1120a11-86ed-4714-891a-04bcba2a8ea2"). InnerVolumeSpecName "kube-api-access-wfqpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.304278 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f643319b-ae93-4b48-b10d-7e7f5f27a7c6-kube-api-access-2rk5x" (OuterVolumeSpecName: "kube-api-access-2rk5x") pod "f643319b-ae93-4b48-b10d-7e7f5f27a7c6" (UID: "f643319b-ae93-4b48-b10d-7e7f5f27a7c6"). InnerVolumeSpecName "kube-api-access-2rk5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.304309 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ccda46-f693-4e5e-82e9-a874cafbceb8-kube-api-access-46tzn" (OuterVolumeSpecName: "kube-api-access-46tzn") pod "72ccda46-f693-4e5e-82e9-a874cafbceb8" (UID: "72ccda46-f693-4e5e-82e9-a874cafbceb8"). InnerVolumeSpecName "kube-api-access-46tzn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.402473 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1120a11-86ed-4714-891a-04bcba2a8ea2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.402509 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72ccda46-f693-4e5e-82e9-a874cafbceb8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.402524 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfqpr\" (UniqueName: \"kubernetes.io/projected/d1120a11-86ed-4714-891a-04bcba2a8ea2-kube-api-access-wfqpr\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.402537 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46tzn\" (UniqueName: \"kubernetes.io/projected/72ccda46-f693-4e5e-82e9-a874cafbceb8-kube-api-access-46tzn\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.402552 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f643319b-ae93-4b48-b10d-7e7f5f27a7c6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.402564 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rk5x\" (UniqueName: \"kubernetes.io/projected/f643319b-ae93-4b48-b10d-7e7f5f27a7c6-kube-api-access-2rk5x\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.431856 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35cb-account-create-update-8zfvq" 
event={"ID":"f643319b-ae93-4b48-b10d-7e7f5f27a7c6","Type":"ContainerDied","Data":"439a643c43cae456ed51b2ddf14e707a6c829894d6841ce76f6a28d1410e9bdc"} Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.431882 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-35cb-account-create-update-8zfvq" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.431900 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="439a643c43cae456ed51b2ddf14e707a6c829894d6841ce76f6a28d1410e9bdc" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.433985 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0bf0-account-create-update-pxvvz" event={"ID":"d1120a11-86ed-4714-891a-04bcba2a8ea2","Type":"ContainerDied","Data":"a5a0311aa9760a9ccddb8e493e87cfbbf9ec05f733d16ac5dac8e6028c30d019"} Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.434037 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5a0311aa9760a9ccddb8e493e87cfbbf9ec05f733d16ac5dac8e6028c30d019" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.434117 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0bf0-account-create-update-pxvvz" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.440446 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q5g9w" event={"ID":"6d73d287-764a-4274-a18c-f38c42e85d2d","Type":"ContainerDied","Data":"1f1b97dfa7a3afd6840f08448124753a25e7da4dbd2ced1ddf25a437193311b4"} Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.440492 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f1b97dfa7a3afd6840f08448124753a25e7da4dbd2ced1ddf25a437193311b4" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.440550 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-q5g9w" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.445315 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b5l2s" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.445317 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b5l2s" event={"ID":"72ccda46-f693-4e5e-82e9-a874cafbceb8","Type":"ContainerDied","Data":"e37c883745ce2d52db3cdfcf80ad25dad747e94bec39632bdb88cd52d7904dc8"} Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.445403 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e37c883745ce2d52db3cdfcf80ad25dad747e94bec39632bdb88cd52d7904dc8" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.876461 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-fptmr"] Mar 10 16:07:50 crc kubenswrapper[4749]: E0310 16:07:50.876859 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a" containerName="mariadb-account-create-update" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.876884 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a" containerName="mariadb-account-create-update" Mar 10 16:07:50 crc kubenswrapper[4749]: E0310 16:07:50.876904 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d819df58-87d2-4495-8999-657b76f0e906" containerName="mariadb-account-create-update" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.876912 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d819df58-87d2-4495-8999-657b76f0e906" containerName="mariadb-account-create-update" Mar 10 16:07:50 crc kubenswrapper[4749]: E0310 16:07:50.876923 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ccda46-f693-4e5e-82e9-a874cafbceb8" containerName="mariadb-database-create" Mar 
10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.876933 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ccda46-f693-4e5e-82e9-a874cafbceb8" containerName="mariadb-database-create" Mar 10 16:07:50 crc kubenswrapper[4749]: E0310 16:07:50.876946 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1120a11-86ed-4714-891a-04bcba2a8ea2" containerName="mariadb-account-create-update" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.876952 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1120a11-86ed-4714-891a-04bcba2a8ea2" containerName="mariadb-account-create-update" Mar 10 16:07:50 crc kubenswrapper[4749]: E0310 16:07:50.876963 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f643319b-ae93-4b48-b10d-7e7f5f27a7c6" containerName="mariadb-account-create-update" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.876970 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f643319b-ae93-4b48-b10d-7e7f5f27a7c6" containerName="mariadb-account-create-update" Mar 10 16:07:50 crc kubenswrapper[4749]: E0310 16:07:50.876986 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec39734d-eadc-4736-9eb5-98ad1c2a233c" containerName="dnsmasq-dns" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.876995 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec39734d-eadc-4736-9eb5-98ad1c2a233c" containerName="dnsmasq-dns" Mar 10 16:07:50 crc kubenswrapper[4749]: E0310 16:07:50.877011 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d73d287-764a-4274-a18c-f38c42e85d2d" containerName="mariadb-database-create" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.877019 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d73d287-764a-4274-a18c-f38c42e85d2d" containerName="mariadb-database-create" Mar 10 16:07:50 crc kubenswrapper[4749]: E0310 16:07:50.877035 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b68ebbd2-399e-4e9d-938d-c3209c46d76a" containerName="mariadb-database-create" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.877043 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68ebbd2-399e-4e9d-938d-c3209c46d76a" containerName="mariadb-database-create" Mar 10 16:07:50 crc kubenswrapper[4749]: E0310 16:07:50.877053 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec39734d-eadc-4736-9eb5-98ad1c2a233c" containerName="init" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.877061 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec39734d-eadc-4736-9eb5-98ad1c2a233c" containerName="init" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.877252 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1120a11-86ed-4714-891a-04bcba2a8ea2" containerName="mariadb-account-create-update" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.877269 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a" containerName="mariadb-account-create-update" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.877279 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f643319b-ae93-4b48-b10d-7e7f5f27a7c6" containerName="mariadb-account-create-update" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.877295 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ccda46-f693-4e5e-82e9-a874cafbceb8" containerName="mariadb-database-create" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.877305 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d73d287-764a-4274-a18c-f38c42e85d2d" containerName="mariadb-database-create" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.877320 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b68ebbd2-399e-4e9d-938d-c3209c46d76a" containerName="mariadb-database-create" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 
16:07:50.877332 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d819df58-87d2-4495-8999-657b76f0e906" containerName="mariadb-account-create-update" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.877343 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec39734d-eadc-4736-9eb5-98ad1c2a233c" containerName="dnsmasq-dns" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.877888 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-fptmr" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.880462 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.882059 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tq4t4" Mar 10 16:07:50 crc kubenswrapper[4749]: I0310 16:07:50.887906 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-fptmr"] Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.017166 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-combined-ca-bundle\") pod \"glance-db-sync-fptmr\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " pod="openstack/glance-db-sync-fptmr" Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.017406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpklb\" (UniqueName: \"kubernetes.io/projected/709acaab-3856-4321-8076-f615a144105d-kube-api-access-dpklb\") pod \"glance-db-sync-fptmr\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " pod="openstack/glance-db-sync-fptmr" Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.017549 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-db-sync-config-data\") pod \"glance-db-sync-fptmr\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " pod="openstack/glance-db-sync-fptmr" Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.017852 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-config-data\") pod \"glance-db-sync-fptmr\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " pod="openstack/glance-db-sync-fptmr" Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.119710 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpklb\" (UniqueName: \"kubernetes.io/projected/709acaab-3856-4321-8076-f615a144105d-kube-api-access-dpklb\") pod \"glance-db-sync-fptmr\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " pod="openstack/glance-db-sync-fptmr" Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.119794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-db-sync-config-data\") pod \"glance-db-sync-fptmr\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " pod="openstack/glance-db-sync-fptmr" Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.119874 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-config-data\") pod \"glance-db-sync-fptmr\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " pod="openstack/glance-db-sync-fptmr" Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.119970 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-combined-ca-bundle\") pod \"glance-db-sync-fptmr\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " pod="openstack/glance-db-sync-fptmr" Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.125511 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-db-sync-config-data\") pod \"glance-db-sync-fptmr\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " pod="openstack/glance-db-sync-fptmr" Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.132031 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-config-data\") pod \"glance-db-sync-fptmr\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " pod="openstack/glance-db-sync-fptmr" Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.139500 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-combined-ca-bundle\") pod \"glance-db-sync-fptmr\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " pod="openstack/glance-db-sync-fptmr" Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.154665 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpklb\" (UniqueName: \"kubernetes.io/projected/709acaab-3856-4321-8076-f615a144105d-kube-api-access-dpklb\") pod \"glance-db-sync-fptmr\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " pod="openstack/glance-db-sync-fptmr" Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.197322 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fptmr" Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.468645 4749 generic.go:334] "Generic (PLEG): container finished" podID="d88b3e71-b8ae-44cf-a104-3236bc27a87f" containerID="940bacbed596a6b64b32192506d5b4b3e282715aad28c953f1f7f4c388805cb7" exitCode=0 Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.468708 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kxbww" event={"ID":"d88b3e71-b8ae-44cf-a104-3236bc27a87f","Type":"ContainerDied","Data":"940bacbed596a6b64b32192506d5b4b3e282715aad28c953f1f7f4c388805cb7"} Mar 10 16:07:51 crc kubenswrapper[4749]: I0310 16:07:51.725707 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-fptmr"] Mar 10 16:07:51 crc kubenswrapper[4749]: W0310 16:07:51.730250 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod709acaab_3856_4321_8076_f615a144105d.slice/crio-a91895416d20db9fd60db4ac4c1c9206ef2e77086f10d00a7d4ee693ff8237ce WatchSource:0}: Error finding container a91895416d20db9fd60db4ac4c1c9206ef2e77086f10d00a7d4ee693ff8237ce: Status 404 returned error can't find the container with id a91895416d20db9fd60db4ac4c1c9206ef2e77086f10d00a7d4ee693ff8237ce Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.199353 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vrhf8"] Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.210054 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vrhf8"] Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.218514 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-cr5kc"] Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.219841 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cr5kc" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.221816 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.229175 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cr5kc"] Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.341616 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhfm5\" (UniqueName: \"kubernetes.io/projected/e37f7a74-810d-4165-bdb4-eb70e15c4f97-kube-api-access-dhfm5\") pod \"root-account-create-update-cr5kc\" (UID: \"e37f7a74-810d-4165-bdb4-eb70e15c4f97\") " pod="openstack/root-account-create-update-cr5kc" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.341739 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e37f7a74-810d-4165-bdb4-eb70e15c4f97-operator-scripts\") pod \"root-account-create-update-cr5kc\" (UID: \"e37f7a74-810d-4165-bdb4-eb70e15c4f97\") " pod="openstack/root-account-create-update-cr5kc" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.443217 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e37f7a74-810d-4165-bdb4-eb70e15c4f97-operator-scripts\") pod \"root-account-create-update-cr5kc\" (UID: \"e37f7a74-810d-4165-bdb4-eb70e15c4f97\") " pod="openstack/root-account-create-update-cr5kc" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.443335 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhfm5\" (UniqueName: \"kubernetes.io/projected/e37f7a74-810d-4165-bdb4-eb70e15c4f97-kube-api-access-dhfm5\") pod \"root-account-create-update-cr5kc\" (UID: 
\"e37f7a74-810d-4165-bdb4-eb70e15c4f97\") " pod="openstack/root-account-create-update-cr5kc" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.444207 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e37f7a74-810d-4165-bdb4-eb70e15c4f97-operator-scripts\") pod \"root-account-create-update-cr5kc\" (UID: \"e37f7a74-810d-4165-bdb4-eb70e15c4f97\") " pod="openstack/root-account-create-update-cr5kc" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.460507 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhfm5\" (UniqueName: \"kubernetes.io/projected/e37f7a74-810d-4165-bdb4-eb70e15c4f97-kube-api-access-dhfm5\") pod \"root-account-create-update-cr5kc\" (UID: \"e37f7a74-810d-4165-bdb4-eb70e15c4f97\") " pod="openstack/root-account-create-update-cr5kc" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.476941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fptmr" event={"ID":"709acaab-3856-4321-8076-f615a144105d","Type":"ContainerStarted","Data":"a91895416d20db9fd60db4ac4c1c9206ef2e77086f10d00a7d4ee693ff8237ce"} Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.543241 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cr5kc" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.820414 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.953520 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-swiftconf\") pod \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.953602 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d88b3e71-b8ae-44cf-a104-3236bc27a87f-scripts\") pod \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.953662 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-dispersionconf\") pod \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.953690 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d88b3e71-b8ae-44cf-a104-3236bc27a87f-etc-swift\") pod \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.953754 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrhfm\" (UniqueName: \"kubernetes.io/projected/d88b3e71-b8ae-44cf-a104-3236bc27a87f-kube-api-access-mrhfm\") pod \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.953811 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-combined-ca-bundle\") pod \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.954326 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d88b3e71-b8ae-44cf-a104-3236bc27a87f-ring-data-devices\") pod \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\" (UID: \"d88b3e71-b8ae-44cf-a104-3236bc27a87f\") " Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.954975 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d88b3e71-b8ae-44cf-a104-3236bc27a87f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d88b3e71-b8ae-44cf-a104-3236bc27a87f" (UID: "d88b3e71-b8ae-44cf-a104-3236bc27a87f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.954974 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d88b3e71-b8ae-44cf-a104-3236bc27a87f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d88b3e71-b8ae-44cf-a104-3236bc27a87f" (UID: "d88b3e71-b8ae-44cf-a104-3236bc27a87f"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.955317 4749 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d88b3e71-b8ae-44cf-a104-3236bc27a87f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.955330 4749 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d88b3e71-b8ae-44cf-a104-3236bc27a87f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.963533 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d88b3e71-b8ae-44cf-a104-3236bc27a87f" (UID: "d88b3e71-b8ae-44cf-a104-3236bc27a87f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.967445 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d88b3e71-b8ae-44cf-a104-3236bc27a87f-kube-api-access-mrhfm" (OuterVolumeSpecName: "kube-api-access-mrhfm") pod "d88b3e71-b8ae-44cf-a104-3236bc27a87f" (UID: "d88b3e71-b8ae-44cf-a104-3236bc27a87f"). InnerVolumeSpecName "kube-api-access-mrhfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.981611 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d88b3e71-b8ae-44cf-a104-3236bc27a87f" (UID: "d88b3e71-b8ae-44cf-a104-3236bc27a87f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:07:52 crc kubenswrapper[4749]: I0310 16:07:52.982108 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d88b3e71-b8ae-44cf-a104-3236bc27a87f-scripts" (OuterVolumeSpecName: "scripts") pod "d88b3e71-b8ae-44cf-a104-3236bc27a87f" (UID: "d88b3e71-b8ae-44cf-a104-3236bc27a87f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:53 crc kubenswrapper[4749]: I0310 16:07:53.001699 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d88b3e71-b8ae-44cf-a104-3236bc27a87f" (UID: "d88b3e71-b8ae-44cf-a104-3236bc27a87f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:07:53 crc kubenswrapper[4749]: I0310 16:07:53.050859 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cr5kc"] Mar 10 16:07:53 crc kubenswrapper[4749]: I0310 16:07:53.057135 4749 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:53 crc kubenswrapper[4749]: I0310 16:07:53.057164 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d88b3e71-b8ae-44cf-a104-3236bc27a87f-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:53 crc kubenswrapper[4749]: I0310 16:07:53.057173 4749 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:53 crc kubenswrapper[4749]: I0310 16:07:53.057183 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrhfm\" (UniqueName: 
\"kubernetes.io/projected/d88b3e71-b8ae-44cf-a104-3236bc27a87f-kube-api-access-mrhfm\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:53 crc kubenswrapper[4749]: I0310 16:07:53.057196 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88b3e71-b8ae-44cf-a104-3236bc27a87f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:53 crc kubenswrapper[4749]: W0310 16:07:53.059657 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode37f7a74_810d_4165_bdb4_eb70e15c4f97.slice/crio-a37671618bbe0528d4e71eebda457d9e38e001252e007c1090d77160a5ffc648 WatchSource:0}: Error finding container a37671618bbe0528d4e71eebda457d9e38e001252e007c1090d77160a5ffc648: Status 404 returned error can't find the container with id a37671618bbe0528d4e71eebda457d9e38e001252e007c1090d77160a5ffc648 Mar 10 16:07:53 crc kubenswrapper[4749]: I0310 16:07:53.485938 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kxbww" Mar 10 16:07:53 crc kubenswrapper[4749]: I0310 16:07:53.485934 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kxbww" event={"ID":"d88b3e71-b8ae-44cf-a104-3236bc27a87f","Type":"ContainerDied","Data":"9f39e56e1accee8efd1bcd78acd3270c395fb172e9e5a7313dbd9bc6172a5f85"} Mar 10 16:07:53 crc kubenswrapper[4749]: I0310 16:07:53.486270 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f39e56e1accee8efd1bcd78acd3270c395fb172e9e5a7313dbd9bc6172a5f85" Mar 10 16:07:53 crc kubenswrapper[4749]: I0310 16:07:53.487504 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cr5kc" event={"ID":"e37f7a74-810d-4165-bdb4-eb70e15c4f97","Type":"ContainerStarted","Data":"85e9d1c09ca680177c97807f990b0a5b07025c1d5b2201fcbd4caf12b78fdf20"} Mar 10 16:07:53 crc kubenswrapper[4749]: I0310 16:07:53.487554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cr5kc" event={"ID":"e37f7a74-810d-4165-bdb4-eb70e15c4f97","Type":"ContainerStarted","Data":"a37671618bbe0528d4e71eebda457d9e38e001252e007c1090d77160a5ffc648"} Mar 10 16:07:53 crc kubenswrapper[4749]: I0310 16:07:53.502826 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-cr5kc" podStartSLOduration=1.502812131 podStartE2EDuration="1.502812131s" podCreationTimestamp="2026-03-10 16:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:07:53.502272976 +0000 UTC m=+1170.624138663" watchObservedRunningTime="2026-03-10 16:07:53.502812131 +0000 UTC m=+1170.624677818" Mar 10 16:07:53 crc kubenswrapper[4749]: I0310 16:07:53.642354 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d819df58-87d2-4495-8999-657b76f0e906" 
path="/var/lib/kubelet/pods/d819df58-87d2-4495-8999-657b76f0e906/volumes" Mar 10 16:07:54 crc kubenswrapper[4749]: I0310 16:07:54.098417 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 10 16:07:54 crc kubenswrapper[4749]: I0310 16:07:54.480796 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:54 crc kubenswrapper[4749]: I0310 16:07:54.490415 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift\") pod \"swift-storage-0\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " pod="openstack/swift-storage-0" Mar 10 16:07:54 crc kubenswrapper[4749]: I0310 16:07:54.510523 4749 generic.go:334] "Generic (PLEG): container finished" podID="e37f7a74-810d-4165-bdb4-eb70e15c4f97" containerID="85e9d1c09ca680177c97807f990b0a5b07025c1d5b2201fcbd4caf12b78fdf20" exitCode=0 Mar 10 16:07:54 crc kubenswrapper[4749]: I0310 16:07:54.510587 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cr5kc" event={"ID":"e37f7a74-810d-4165-bdb4-eb70e15c4f97","Type":"ContainerDied","Data":"85e9d1c09ca680177c97807f990b0a5b07025c1d5b2201fcbd4caf12b78fdf20"} Mar 10 16:07:54 crc kubenswrapper[4749]: I0310 16:07:54.583270 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 10 16:07:55 crc kubenswrapper[4749]: I0310 16:07:55.125202 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 10 16:07:55 crc kubenswrapper[4749]: W0310 16:07:55.142669 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85d50314_7d2d_4d92_9a78_846a573a3000.slice/crio-47f134fe8c0bcee5c546e8282ad7f5660df60ec57745ea899b947150cdb2e3fe WatchSource:0}: Error finding container 47f134fe8c0bcee5c546e8282ad7f5660df60ec57745ea899b947150cdb2e3fe: Status 404 returned error can't find the container with id 47f134fe8c0bcee5c546e8282ad7f5660df60ec57745ea899b947150cdb2e3fe Mar 10 16:07:55 crc kubenswrapper[4749]: I0310 16:07:55.525919 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"47f134fe8c0bcee5c546e8282ad7f5660df60ec57745ea899b947150cdb2e3fe"} Mar 10 16:07:55 crc kubenswrapper[4749]: I0310 16:07:55.887926 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cr5kc" Mar 10 16:07:56 crc kubenswrapper[4749]: I0310 16:07:56.015643 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhfm5\" (UniqueName: \"kubernetes.io/projected/e37f7a74-810d-4165-bdb4-eb70e15c4f97-kube-api-access-dhfm5\") pod \"e37f7a74-810d-4165-bdb4-eb70e15c4f97\" (UID: \"e37f7a74-810d-4165-bdb4-eb70e15c4f97\") " Mar 10 16:07:56 crc kubenswrapper[4749]: I0310 16:07:56.015719 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e37f7a74-810d-4165-bdb4-eb70e15c4f97-operator-scripts\") pod \"e37f7a74-810d-4165-bdb4-eb70e15c4f97\" (UID: \"e37f7a74-810d-4165-bdb4-eb70e15c4f97\") " Mar 10 16:07:56 crc kubenswrapper[4749]: I0310 16:07:56.017085 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37f7a74-810d-4165-bdb4-eb70e15c4f97-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e37f7a74-810d-4165-bdb4-eb70e15c4f97" (UID: "e37f7a74-810d-4165-bdb4-eb70e15c4f97"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:07:56 crc kubenswrapper[4749]: I0310 16:07:56.022933 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37f7a74-810d-4165-bdb4-eb70e15c4f97-kube-api-access-dhfm5" (OuterVolumeSpecName: "kube-api-access-dhfm5") pod "e37f7a74-810d-4165-bdb4-eb70e15c4f97" (UID: "e37f7a74-810d-4165-bdb4-eb70e15c4f97"). InnerVolumeSpecName "kube-api-access-dhfm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:07:56 crc kubenswrapper[4749]: I0310 16:07:56.118058 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhfm5\" (UniqueName: \"kubernetes.io/projected/e37f7a74-810d-4165-bdb4-eb70e15c4f97-kube-api-access-dhfm5\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:56 crc kubenswrapper[4749]: I0310 16:07:56.118096 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e37f7a74-810d-4165-bdb4-eb70e15c4f97-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:07:56 crc kubenswrapper[4749]: I0310 16:07:56.547712 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cr5kc" event={"ID":"e37f7a74-810d-4165-bdb4-eb70e15c4f97","Type":"ContainerDied","Data":"a37671618bbe0528d4e71eebda457d9e38e001252e007c1090d77160a5ffc648"} Mar 10 16:07:56 crc kubenswrapper[4749]: I0310 16:07:56.548030 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a37671618bbe0528d4e71eebda457d9e38e001252e007c1090d77160a5ffc648" Mar 10 16:07:56 crc kubenswrapper[4749]: I0310 16:07:56.548062 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-cr5kc" Mar 10 16:07:57 crc kubenswrapper[4749]: I0310 16:07:57.652526 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vms4g" podUID="0ad2c472-e0a5-43d7-971e-a242a578042b" containerName="ovn-controller" probeResult="failure" output=< Mar 10 16:07:57 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 10 16:07:57 crc kubenswrapper[4749]: > Mar 10 16:08:00 crc kubenswrapper[4749]: I0310 16:08:00.129451 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552648-q6zd8"] Mar 10 16:08:00 crc kubenswrapper[4749]: E0310 16:08:00.130075 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d88b3e71-b8ae-44cf-a104-3236bc27a87f" containerName="swift-ring-rebalance" Mar 10 16:08:00 crc kubenswrapper[4749]: I0310 16:08:00.130091 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88b3e71-b8ae-44cf-a104-3236bc27a87f" containerName="swift-ring-rebalance" Mar 10 16:08:00 crc kubenswrapper[4749]: E0310 16:08:00.130101 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37f7a74-810d-4165-bdb4-eb70e15c4f97" containerName="mariadb-account-create-update" Mar 10 16:08:00 crc kubenswrapper[4749]: I0310 16:08:00.130107 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37f7a74-810d-4165-bdb4-eb70e15c4f97" containerName="mariadb-account-create-update" Mar 10 16:08:00 crc kubenswrapper[4749]: I0310 16:08:00.130360 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37f7a74-810d-4165-bdb4-eb70e15c4f97" containerName="mariadb-account-create-update" Mar 10 16:08:00 crc kubenswrapper[4749]: I0310 16:08:00.130402 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d88b3e71-b8ae-44cf-a104-3236bc27a87f" containerName="swift-ring-rebalance" Mar 10 16:08:00 crc kubenswrapper[4749]: I0310 
16:08:00.131003 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552648-q6zd8" Mar 10 16:08:00 crc kubenswrapper[4749]: I0310 16:08:00.135720 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:08:00 crc kubenswrapper[4749]: I0310 16:08:00.135765 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:08:00 crc kubenswrapper[4749]: I0310 16:08:00.138992 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:08:00 crc kubenswrapper[4749]: I0310 16:08:00.147140 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552648-q6zd8"] Mar 10 16:08:00 crc kubenswrapper[4749]: I0310 16:08:00.287248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vpnf\" (UniqueName: \"kubernetes.io/projected/75e7e543-1b7f-49b7-af99-2677c8b0cd2c-kube-api-access-2vpnf\") pod \"auto-csr-approver-29552648-q6zd8\" (UID: \"75e7e543-1b7f-49b7-af99-2677c8b0cd2c\") " pod="openshift-infra/auto-csr-approver-29552648-q6zd8" Mar 10 16:08:00 crc kubenswrapper[4749]: I0310 16:08:00.389315 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vpnf\" (UniqueName: \"kubernetes.io/projected/75e7e543-1b7f-49b7-af99-2677c8b0cd2c-kube-api-access-2vpnf\") pod \"auto-csr-approver-29552648-q6zd8\" (UID: \"75e7e543-1b7f-49b7-af99-2677c8b0cd2c\") " pod="openshift-infra/auto-csr-approver-29552648-q6zd8" Mar 10 16:08:00 crc kubenswrapper[4749]: I0310 16:08:00.406464 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vpnf\" (UniqueName: \"kubernetes.io/projected/75e7e543-1b7f-49b7-af99-2677c8b0cd2c-kube-api-access-2vpnf\") pod 
\"auto-csr-approver-29552648-q6zd8\" (UID: \"75e7e543-1b7f-49b7-af99-2677c8b0cd2c\") " pod="openshift-infra/auto-csr-approver-29552648-q6zd8" Mar 10 16:08:00 crc kubenswrapper[4749]: I0310 16:08:00.451537 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552648-q6zd8" Mar 10 16:08:01 crc kubenswrapper[4749]: I0310 16:08:01.594814 4749 generic.go:334] "Generic (PLEG): container finished" podID="d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" containerID="743b0f5a36ebdb07c835bb35540b750bb909b97412f42ebd1ec0fb4999abbe52" exitCode=0 Mar 10 16:08:01 crc kubenswrapper[4749]: I0310 16:08:01.595003 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3","Type":"ContainerDied","Data":"743b0f5a36ebdb07c835bb35540b750bb909b97412f42ebd1ec0fb4999abbe52"} Mar 10 16:08:01 crc kubenswrapper[4749]: I0310 16:08:01.599916 4749 generic.go:334] "Generic (PLEG): container finished" podID="1feaa4c9-2cec-45a8-9106-5be885c26eae" containerID="367802e57c3ed96a7c16df84afe14689df4644175e58ac2036c8eabd7a974802" exitCode=0 Mar 10 16:08:01 crc kubenswrapper[4749]: I0310 16:08:01.599946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1feaa4c9-2cec-45a8-9106-5be885c26eae","Type":"ContainerDied","Data":"367802e57c3ed96a7c16df84afe14689df4644175e58ac2036c8eabd7a974802"} Mar 10 16:08:01 crc kubenswrapper[4749]: I0310 16:08:01.670791 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552648-q6zd8"] Mar 10 16:08:02 crc kubenswrapper[4749]: I0310 16:08:02.655926 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vms4g" podUID="0ad2c472-e0a5-43d7-971e-a242a578042b" containerName="ovn-controller" probeResult="failure" output=< Mar 10 16:08:02 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', 
expecting 'connected' status Mar 10 16:08:02 crc kubenswrapper[4749]: > Mar 10 16:08:02 crc kubenswrapper[4749]: I0310 16:08:02.679075 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:08:02 crc kubenswrapper[4749]: I0310 16:08:02.686718 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:08:02 crc kubenswrapper[4749]: I0310 16:08:02.912368 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vms4g-config-s9jq8"] Mar 10 16:08:02 crc kubenswrapper[4749]: I0310 16:08:02.914582 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:02 crc kubenswrapper[4749]: I0310 16:08:02.918513 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 10 16:08:02 crc kubenswrapper[4749]: I0310 16:08:02.922828 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vms4g-config-s9jq8"] Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.045279 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-log-ovn\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.045345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-additional-scripts\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc 
kubenswrapper[4749]: I0310 16:08:03.045433 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-scripts\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.045483 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-run-ovn\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.045531 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsddp\" (UniqueName: \"kubernetes.io/projected/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-kube-api-access-fsddp\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.045624 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-run\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.147125 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-scripts\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " 
pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.147204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-run-ovn\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.147266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsddp\" (UniqueName: \"kubernetes.io/projected/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-kube-api-access-fsddp\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.147290 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-run\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.147336 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-log-ovn\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.147393 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-additional-scripts\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " 
pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.147835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-run\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.147930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-log-ovn\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.148069 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-run-ovn\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.148269 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-additional-scripts\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.150029 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-scripts\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc 
kubenswrapper[4749]: I0310 16:08:03.164801 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsddp\" (UniqueName: \"kubernetes.io/projected/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-kube-api-access-fsddp\") pod \"ovn-controller-vms4g-config-s9jq8\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:03 crc kubenswrapper[4749]: I0310 16:08:03.236338 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:07 crc kubenswrapper[4749]: I0310 16:08:07.653983 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vms4g" podUID="0ad2c472-e0a5-43d7-971e-a242a578042b" containerName="ovn-controller" probeResult="failure" output=< Mar 10 16:08:07 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 10 16:08:07 crc kubenswrapper[4749]: > Mar 10 16:08:09 crc kubenswrapper[4749]: E0310 16:08:09.026303 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:ed912eee9adeda5c44804688cc7661695a42ab1a40fa46b28bdc819cefa98f07" Mar 10 16:08:09 crc kubenswrapper[4749]: E0310 16:08:09.028206 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:ed912eee9adeda5c44804688cc7661695a42ab1a40fa46b28bdc819cefa98f07,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dpklb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-fptmr_openstack(709acaab-3856-4321-8076-f615a144105d): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Mar 10 16:08:09 crc kubenswrapper[4749]: E0310 16:08:09.029453 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-fptmr" podUID="709acaab-3856-4321-8076-f615a144105d" Mar 10 16:08:09 crc kubenswrapper[4749]: I0310 16:08:09.307634 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:08:09 crc kubenswrapper[4749]: I0310 16:08:09.660197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1feaa4c9-2cec-45a8-9106-5be885c26eae","Type":"ContainerStarted","Data":"26814de58b1f416e7e0be2cfe89690c8b3811cee361fb2844178ac8832bae25d"} Mar 10 16:08:09 crc kubenswrapper[4749]: I0310 16:08:09.660753 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 16:08:09 crc kubenswrapper[4749]: I0310 16:08:09.662271 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"29ac57ed16691afc977b4681e0422ae230ea59a572f4650e3bdddd776aa79bb5"} Mar 10 16:08:09 crc kubenswrapper[4749]: I0310 16:08:09.664226 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3","Type":"ContainerStarted","Data":"3755df7d0a3f21329c48cc7cedfea9c0673b59bab2514f03a809161f3ed9250a"} Mar 10 16:08:09 crc kubenswrapper[4749]: I0310 16:08:09.664507 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:08:09 crc kubenswrapper[4749]: I0310 16:08:09.665405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552648-q6zd8" 
event={"ID":"75e7e543-1b7f-49b7-af99-2677c8b0cd2c","Type":"ContainerStarted","Data":"f87f1e59457bf6777cf9c85c553f2a1ebcaa0b9dc295c225160584804a8b22db"} Mar 10 16:08:09 crc kubenswrapper[4749]: E0310 16:08:09.667548 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:ed912eee9adeda5c44804688cc7661695a42ab1a40fa46b28bdc819cefa98f07\\\"\"" pod="openstack/glance-db-sync-fptmr" podUID="709acaab-3856-4321-8076-f615a144105d" Mar 10 16:08:09 crc kubenswrapper[4749]: I0310 16:08:09.715918 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=59.50149162 podStartE2EDuration="1m9.715901498s" podCreationTimestamp="2026-03-10 16:07:00 +0000 UTC" firstStartedPulling="2026-03-10 16:07:14.85742843 +0000 UTC m=+1131.979294107" lastFinishedPulling="2026-03-10 16:07:25.071838298 +0000 UTC m=+1142.193703985" observedRunningTime="2026-03-10 16:08:09.686638398 +0000 UTC m=+1186.808504125" watchObservedRunningTime="2026-03-10 16:08:09.715901498 +0000 UTC m=+1186.837767185" Mar 10 16:08:09 crc kubenswrapper[4749]: I0310 16:08:09.717443 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=56.335578297 podStartE2EDuration="1m8.717432559s" podCreationTimestamp="2026-03-10 16:07:01 +0000 UTC" firstStartedPulling="2026-03-10 16:07:13.97619034 +0000 UTC m=+1131.098056037" lastFinishedPulling="2026-03-10 16:07:26.358044612 +0000 UTC m=+1143.479910299" observedRunningTime="2026-03-10 16:08:09.709246603 +0000 UTC m=+1186.831112290" watchObservedRunningTime="2026-03-10 16:08:09.717432559 +0000 UTC m=+1186.839298246" Mar 10 16:08:09 crc kubenswrapper[4749]: I0310 16:08:09.763297 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-vms4g-config-s9jq8"] Mar 10 16:08:10 crc kubenswrapper[4749]: I0310 16:08:10.676629 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"2efb77202f22e2883bdd91f0e2cbe30b348e7e743f36692bb368eee05f179b1c"} Mar 10 16:08:10 crc kubenswrapper[4749]: I0310 16:08:10.677644 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"419a6797b9dc5a25731dd54073656c1ca9fe4985a5320bbf58bad0ef2faab065"} Mar 10 16:08:10 crc kubenswrapper[4749]: I0310 16:08:10.677662 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"69a1fa408b996e90e4ce6fd72aa7dc3a695dcdd428a8dbfb31ec84cff99d7620"} Mar 10 16:08:10 crc kubenswrapper[4749]: I0310 16:08:10.682835 4749 generic.go:334] "Generic (PLEG): container finished" podID="c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23" containerID="8f39916e690931ab258bcb7e6275112f699ad8897bb791ad2e833c45747e6c9f" exitCode=0 Mar 10 16:08:10 crc kubenswrapper[4749]: I0310 16:08:10.682893 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vms4g-config-s9jq8" event={"ID":"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23","Type":"ContainerDied","Data":"8f39916e690931ab258bcb7e6275112f699ad8897bb791ad2e833c45747e6c9f"} Mar 10 16:08:10 crc kubenswrapper[4749]: I0310 16:08:10.682965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vms4g-config-s9jq8" event={"ID":"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23","Type":"ContainerStarted","Data":"b594dad203ea3f1e0f29f86a0702bcec60b4c6d36af762ee18ba0542d1eed06c"} Mar 10 16:08:11 crc kubenswrapper[4749]: I0310 16:08:11.695644 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="75e7e543-1b7f-49b7-af99-2677c8b0cd2c" containerID="e4bba11be08f890ec02580c33cdb287c152b439af58261322ddf2d38ca38ae5a" exitCode=0 Mar 10 16:08:11 crc kubenswrapper[4749]: I0310 16:08:11.695758 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552648-q6zd8" event={"ID":"75e7e543-1b7f-49b7-af99-2677c8b0cd2c","Type":"ContainerDied","Data":"e4bba11be08f890ec02580c33cdb287c152b439af58261322ddf2d38ca38ae5a"} Mar 10 16:08:11 crc kubenswrapper[4749]: I0310 16:08:11.700413 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"65364d61dd576038476454d20bf73aad662c4d66d9c6d66420bac6b5eaf2e5a9"} Mar 10 16:08:11 crc kubenswrapper[4749]: I0310 16:08:11.958224 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.011675 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-additional-scripts\") pod \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.011741 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsddp\" (UniqueName: \"kubernetes.io/projected/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-kube-api-access-fsddp\") pod \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.011795 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-log-ovn\") pod \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\" (UID: 
\"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.011855 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-run-ovn\") pod \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.011952 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-scripts\") pod \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.011984 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-run\") pod \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\" (UID: \"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23\") " Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.012329 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-run" (OuterVolumeSpecName: "var-run") pod "c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23" (UID: "c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.012340 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23" (UID: "c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.012596 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23" (UID: "c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.013071 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23" (UID: "c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.014085 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-scripts" (OuterVolumeSpecName: "scripts") pod "c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23" (UID: "c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.016671 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-kube-api-access-fsddp" (OuterVolumeSpecName: "kube-api-access-fsddp") pod "c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23" (UID: "c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23"). InnerVolumeSpecName "kube-api-access-fsddp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.113623 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.113662 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.113674 4749 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.113685 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsddp\" (UniqueName: \"kubernetes.io/projected/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-kube-api-access-fsddp\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.113694 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.113702 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.670040 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-vms4g" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.708947 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vms4g-config-s9jq8" 
event={"ID":"c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23","Type":"ContainerDied","Data":"b594dad203ea3f1e0f29f86a0702bcec60b4c6d36af762ee18ba0542d1eed06c"} Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.709000 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b594dad203ea3f1e0f29f86a0702bcec60b4c6d36af762ee18ba0542d1eed06c" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.709068 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vms4g-config-s9jq8" Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.718163 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"8e3dc189fd5a5f36d1ed3928478b796a36ebae234d6abfa35187c0cc7daab6fc"} Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.718235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"f5efe081840048441e1beed7605a3cb1367701bf79c16745edeb89014e57cfd3"} Mar 10 16:08:12 crc kubenswrapper[4749]: I0310 16:08:12.718255 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"053fe9a47af41f4524a0e07b8bbd05a0e52c4cfbfed225b9dbae228562ae848d"} Mar 10 16:08:13 crc kubenswrapper[4749]: I0310 16:08:13.151668 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552648-q6zd8" Mar 10 16:08:13 crc kubenswrapper[4749]: I0310 16:08:13.199144 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vms4g-config-s9jq8"] Mar 10 16:08:13 crc kubenswrapper[4749]: I0310 16:08:13.214797 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vms4g-config-s9jq8"] Mar 10 16:08:13 crc kubenswrapper[4749]: I0310 16:08:13.238998 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vpnf\" (UniqueName: \"kubernetes.io/projected/75e7e543-1b7f-49b7-af99-2677c8b0cd2c-kube-api-access-2vpnf\") pod \"75e7e543-1b7f-49b7-af99-2677c8b0cd2c\" (UID: \"75e7e543-1b7f-49b7-af99-2677c8b0cd2c\") " Mar 10 16:08:13 crc kubenswrapper[4749]: I0310 16:08:13.251912 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e7e543-1b7f-49b7-af99-2677c8b0cd2c-kube-api-access-2vpnf" (OuterVolumeSpecName: "kube-api-access-2vpnf") pod "75e7e543-1b7f-49b7-af99-2677c8b0cd2c" (UID: "75e7e543-1b7f-49b7-af99-2677c8b0cd2c"). InnerVolumeSpecName "kube-api-access-2vpnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:08:13 crc kubenswrapper[4749]: I0310 16:08:13.340637 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vpnf\" (UniqueName: \"kubernetes.io/projected/75e7e543-1b7f-49b7-af99-2677c8b0cd2c-kube-api-access-2vpnf\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:13 crc kubenswrapper[4749]: I0310 16:08:13.618837 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23" path="/var/lib/kubelet/pods/c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23/volumes" Mar 10 16:08:13 crc kubenswrapper[4749]: I0310 16:08:13.726832 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552648-q6zd8" event={"ID":"75e7e543-1b7f-49b7-af99-2677c8b0cd2c","Type":"ContainerDied","Data":"f87f1e59457bf6777cf9c85c553f2a1ebcaa0b9dc295c225160584804a8b22db"} Mar 10 16:08:13 crc kubenswrapper[4749]: I0310 16:08:13.727163 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f87f1e59457bf6777cf9c85c553f2a1ebcaa0b9dc295c225160584804a8b22db" Mar 10 16:08:13 crc kubenswrapper[4749]: I0310 16:08:13.726861 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552648-q6zd8" Mar 10 16:08:14 crc kubenswrapper[4749]: I0310 16:08:14.227327 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552642-9nnnj"] Mar 10 16:08:14 crc kubenswrapper[4749]: I0310 16:08:14.234715 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552642-9nnnj"] Mar 10 16:08:14 crc kubenswrapper[4749]: I0310 16:08:14.739344 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"3166a1f44ecaa58fb6e63606a58aaeab5a00e8125684017b580c9d113d9e28b5"} Mar 10 16:08:14 crc kubenswrapper[4749]: I0310 16:08:14.739399 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"939fd72e5f26c025b8bd5f50417db1cf59a2598dfacc057fb8c21e97844cbbfb"} Mar 10 16:08:14 crc kubenswrapper[4749]: I0310 16:08:14.739409 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"74b8af8f2db1800275caf1f0c1dd54c407bf2a89888af0cb0f77a6137eefaa29"} Mar 10 16:08:14 crc kubenswrapper[4749]: I0310 16:08:14.739417 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"17f777430cc879db7bf9f1c9d488b86862db908897af54350d46a2360cb2549a"} Mar 10 16:08:14 crc kubenswrapper[4749]: I0310 16:08:14.739427 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"9e2d6b3f1436ccf55b4a7e8f114d51974e8455865ec7c14f9e8e310b24cbd46f"} Mar 10 16:08:15 crc kubenswrapper[4749]: I0310 
16:08:15.631491 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="460ee780-c5e3-437a-9a2a-3ed268e2173a" path="/var/lib/kubelet/pods/460ee780-c5e3-437a-9a2a-3ed268e2173a/volumes" Mar 10 16:08:15 crc kubenswrapper[4749]: I0310 16:08:15.754130 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"bb3018fbeb8ae7b1c647090c0018a4a47752b2fecaeeb8c6590810e66f9aa576"} Mar 10 16:08:15 crc kubenswrapper[4749]: I0310 16:08:15.754440 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerStarted","Data":"250241ced61fd2f305f7eac66ea651ce382c5d2b773ef64a2685a3c7d8d51177"} Mar 10 16:08:15 crc kubenswrapper[4749]: I0310 16:08:15.798428 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.253809275 podStartE2EDuration="38.798406444s" podCreationTimestamp="2026-03-10 16:07:37 +0000 UTC" firstStartedPulling="2026-03-10 16:07:55.145577249 +0000 UTC m=+1172.267442936" lastFinishedPulling="2026-03-10 16:08:13.690174418 +0000 UTC m=+1190.812040105" observedRunningTime="2026-03-10 16:08:15.792315186 +0000 UTC m=+1192.914180893" watchObservedRunningTime="2026-03-10 16:08:15.798406444 +0000 UTC m=+1192.920272131" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.045767 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58f8d8dcc-w4nhb"] Mar 10 16:08:16 crc kubenswrapper[4749]: E0310 16:08:16.046186 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e7e543-1b7f-49b7-af99-2677c8b0cd2c" containerName="oc" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.046211 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e7e543-1b7f-49b7-af99-2677c8b0cd2c" containerName="oc" Mar 10 16:08:16 crc kubenswrapper[4749]: E0310 
16:08:16.046247 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23" containerName="ovn-config" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.046259 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23" containerName="ovn-config" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.046550 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e7e543-1b7f-49b7-af99-2677c8b0cd2c" containerName="oc" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.046582 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a0f4f2-4a68-4a08-a08a-e0fca3c12f23" containerName="ovn-config" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.047666 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.049886 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.067663 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58f8d8dcc-w4nhb"] Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.089998 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-ovsdbserver-sb\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.090054 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jfmd\" (UniqueName: \"kubernetes.io/projected/3446bf6c-8fd9-491e-aee2-87c44e34c315-kube-api-access-2jfmd\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: 
\"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.090078 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-dns-swift-storage-0\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.090144 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-dns-svc\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.090195 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-ovsdbserver-nb\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.090213 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-config\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.191226 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-dns-svc\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: 
\"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.191337 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-ovsdbserver-nb\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.191395 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-config\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.191422 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-ovsdbserver-sb\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.191456 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jfmd\" (UniqueName: \"kubernetes.io/projected/3446bf6c-8fd9-491e-aee2-87c44e34c315-kube-api-access-2jfmd\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.191482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-dns-swift-storage-0\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " 
pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.192170 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-dns-svc\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.192452 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-dns-swift-storage-0\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.192600 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-config\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.192972 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-ovsdbserver-sb\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.193352 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-ovsdbserver-nb\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.210759 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jfmd\" (UniqueName: \"kubernetes.io/projected/3446bf6c-8fd9-491e-aee2-87c44e34c315-kube-api-access-2jfmd\") pod \"dnsmasq-dns-58f8d8dcc-w4nhb\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.363489 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:16 crc kubenswrapper[4749]: W0310 16:08:16.847269 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3446bf6c_8fd9_491e_aee2_87c44e34c315.slice/crio-8899f1cc41262a45c5fb09e083cd8b618e6dfe8f46274646946d31b192b3f027 WatchSource:0}: Error finding container 8899f1cc41262a45c5fb09e083cd8b618e6dfe8f46274646946d31b192b3f027: Status 404 returned error can't find the container with id 8899f1cc41262a45c5fb09e083cd8b618e6dfe8f46274646946d31b192b3f027 Mar 10 16:08:16 crc kubenswrapper[4749]: I0310 16:08:16.857409 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58f8d8dcc-w4nhb"] Mar 10 16:08:17 crc kubenswrapper[4749]: I0310 16:08:17.768759 4749 generic.go:334] "Generic (PLEG): container finished" podID="3446bf6c-8fd9-491e-aee2-87c44e34c315" containerID="abf39b43e1b7826151e86c6bb1f90364c93edd1de0334b34bcca2d5d93db8d3a" exitCode=0 Mar 10 16:08:17 crc kubenswrapper[4749]: I0310 16:08:17.768938 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" event={"ID":"3446bf6c-8fd9-491e-aee2-87c44e34c315","Type":"ContainerDied","Data":"abf39b43e1b7826151e86c6bb1f90364c93edd1de0334b34bcca2d5d93db8d3a"} Mar 10 16:08:17 crc kubenswrapper[4749]: I0310 16:08:17.769029 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" 
event={"ID":"3446bf6c-8fd9-491e-aee2-87c44e34c315","Type":"ContainerStarted","Data":"8899f1cc41262a45c5fb09e083cd8b618e6dfe8f46274646946d31b192b3f027"} Mar 10 16:08:18 crc kubenswrapper[4749]: I0310 16:08:18.779147 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" event={"ID":"3446bf6c-8fd9-491e-aee2-87c44e34c315","Type":"ContainerStarted","Data":"255fe3e862784ea5711146921e64697383c4d78a74be22864fdd88cee5a1010a"} Mar 10 16:08:18 crc kubenswrapper[4749]: I0310 16:08:18.779724 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:18 crc kubenswrapper[4749]: I0310 16:08:18.812600 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" podStartSLOduration=2.812558402 podStartE2EDuration="2.812558402s" podCreationTimestamp="2026-03-10 16:08:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:08:18.804212782 +0000 UTC m=+1195.926078469" watchObservedRunningTime="2026-03-10 16:08:18.812558402 +0000 UTC m=+1195.934424089" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.112633 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.449533 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.464018 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-l9q7r"] Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.465604 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-l9q7r" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.474317 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-l9q7r"] Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.587856 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9d56-account-create-update-txqfx"] Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.592993 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9d56-account-create-update-txqfx" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.595078 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.599213 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9d56-account-create-update-txqfx"] Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.621925 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fltnw\" (UniqueName: \"kubernetes.io/projected/489f0d93-0a70-476e-b7ab-7db40933bf88-kube-api-access-fltnw\") pod \"cinder-db-create-l9q7r\" (UID: \"489f0d93-0a70-476e-b7ab-7db40933bf88\") " pod="openstack/cinder-db-create-l9q7r" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.622052 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489f0d93-0a70-476e-b7ab-7db40933bf88-operator-scripts\") pod \"cinder-db-create-l9q7r\" (UID: \"489f0d93-0a70-476e-b7ab-7db40933bf88\") " pod="openstack/cinder-db-create-l9q7r" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.724532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8rp7\" (UniqueName: 
\"kubernetes.io/projected/11d152df-3150-4e52-ac41-1288d89383c2-kube-api-access-d8rp7\") pod \"cinder-9d56-account-create-update-txqfx\" (UID: \"11d152df-3150-4e52-ac41-1288d89383c2\") " pod="openstack/cinder-9d56-account-create-update-txqfx" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.725141 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11d152df-3150-4e52-ac41-1288d89383c2-operator-scripts\") pod \"cinder-9d56-account-create-update-txqfx\" (UID: \"11d152df-3150-4e52-ac41-1288d89383c2\") " pod="openstack/cinder-9d56-account-create-update-txqfx" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.726517 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fltnw\" (UniqueName: \"kubernetes.io/projected/489f0d93-0a70-476e-b7ab-7db40933bf88-kube-api-access-fltnw\") pod \"cinder-db-create-l9q7r\" (UID: \"489f0d93-0a70-476e-b7ab-7db40933bf88\") " pod="openstack/cinder-db-create-l9q7r" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.727111 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489f0d93-0a70-476e-b7ab-7db40933bf88-operator-scripts\") pod \"cinder-db-create-l9q7r\" (UID: \"489f0d93-0a70-476e-b7ab-7db40933bf88\") " pod="openstack/cinder-db-create-l9q7r" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.730439 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-f5f78"] Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.731518 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-f5f78" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.732724 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489f0d93-0a70-476e-b7ab-7db40933bf88-operator-scripts\") pod \"cinder-db-create-l9q7r\" (UID: \"489f0d93-0a70-476e-b7ab-7db40933bf88\") " pod="openstack/cinder-db-create-l9q7r" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.747567 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f5f78"] Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.777543 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c29f-account-create-update-5jmdq"] Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.778801 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c29f-account-create-update-5jmdq" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.783218 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.783687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fltnw\" (UniqueName: \"kubernetes.io/projected/489f0d93-0a70-476e-b7ab-7db40933bf88-kube-api-access-fltnw\") pod \"cinder-db-create-l9q7r\" (UID: \"489f0d93-0a70-476e-b7ab-7db40933bf88\") " pod="openstack/cinder-db-create-l9q7r" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.797828 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-l9q7r" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.803757 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c29f-account-create-update-5jmdq"] Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.830956 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8rp7\" (UniqueName: \"kubernetes.io/projected/11d152df-3150-4e52-ac41-1288d89383c2-kube-api-access-d8rp7\") pod \"cinder-9d56-account-create-update-txqfx\" (UID: \"11d152df-3150-4e52-ac41-1288d89383c2\") " pod="openstack/cinder-9d56-account-create-update-txqfx" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.831330 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11d152df-3150-4e52-ac41-1288d89383c2-operator-scripts\") pod \"cinder-9d56-account-create-update-txqfx\" (UID: \"11d152df-3150-4e52-ac41-1288d89383c2\") " pod="openstack/cinder-9d56-account-create-update-txqfx" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.832029 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11d152df-3150-4e52-ac41-1288d89383c2-operator-scripts\") pod \"cinder-9d56-account-create-update-txqfx\" (UID: \"11d152df-3150-4e52-ac41-1288d89383c2\") " pod="openstack/cinder-9d56-account-create-update-txqfx" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.859856 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8rp7\" (UniqueName: \"kubernetes.io/projected/11d152df-3150-4e52-ac41-1288d89383c2-kube-api-access-d8rp7\") pod \"cinder-9d56-account-create-update-txqfx\" (UID: \"11d152df-3150-4e52-ac41-1288d89383c2\") " pod="openstack/cinder-9d56-account-create-update-txqfx" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.878949 4749 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/keystone-db-sync-mrdvb"] Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.879936 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mrdvb" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.884655 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.884655 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.884947 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ppc5j" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.885017 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.890262 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-qtzqs"] Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.891251 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qtzqs" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.900254 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qtzqs"] Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.908280 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mrdvb"] Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.919920 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9d56-account-create-update-txqfx" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.933630 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s2w2\" (UniqueName: \"kubernetes.io/projected/1e966eb8-aa23-4b7a-8477-1e6e321054f9-kube-api-access-8s2w2\") pod \"barbican-db-create-f5f78\" (UID: \"1e966eb8-aa23-4b7a-8477-1e6e321054f9\") " pod="openstack/barbican-db-create-f5f78" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.933724 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9ww6\" (UniqueName: \"kubernetes.io/projected/8c820403-de63-4498-b9b9-f9881586293a-kube-api-access-f9ww6\") pod \"barbican-c29f-account-create-update-5jmdq\" (UID: \"8c820403-de63-4498-b9b9-f9881586293a\") " pod="openstack/barbican-c29f-account-create-update-5jmdq" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.933778 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e966eb8-aa23-4b7a-8477-1e6e321054f9-operator-scripts\") pod \"barbican-db-create-f5f78\" (UID: \"1e966eb8-aa23-4b7a-8477-1e6e321054f9\") " pod="openstack/barbican-db-create-f5f78" Mar 10 16:08:22 crc kubenswrapper[4749]: I0310 16:08:22.933817 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c820403-de63-4498-b9b9-f9881586293a-operator-scripts\") pod \"barbican-c29f-account-create-update-5jmdq\" (UID: \"8c820403-de63-4498-b9b9-f9881586293a\") " pod="openstack/barbican-c29f-account-create-update-5jmdq" Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.009316 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-964c-account-create-update-vg7qc"] Mar 10 16:08:23 crc kubenswrapper[4749]: 
I0310 16:08:23.010640 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-964c-account-create-update-vg7qc"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.017168 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.021146 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-964c-account-create-update-vg7qc"]
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.036088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9ww6\" (UniqueName: \"kubernetes.io/projected/8c820403-de63-4498-b9b9-f9881586293a-kube-api-access-f9ww6\") pod \"barbican-c29f-account-create-update-5jmdq\" (UID: \"8c820403-de63-4498-b9b9-f9881586293a\") " pod="openstack/barbican-c29f-account-create-update-5jmdq"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.036141 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfknd\" (UniqueName: \"kubernetes.io/projected/2b118176-15b4-4d8c-a2d4-8bc3e53dcd60-kube-api-access-tfknd\") pod \"neutron-db-create-qtzqs\" (UID: \"2b118176-15b4-4d8c-a2d4-8bc3e53dcd60\") " pod="openstack/neutron-db-create-qtzqs"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.036197 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/846a5266-babb-4653-8226-952d8e09d90e-config-data\") pod \"keystone-db-sync-mrdvb\" (UID: \"846a5266-babb-4653-8226-952d8e09d90e\") " pod="openstack/keystone-db-sync-mrdvb"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.036223 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e966eb8-aa23-4b7a-8477-1e6e321054f9-operator-scripts\") pod \"barbican-db-create-f5f78\" (UID: \"1e966eb8-aa23-4b7a-8477-1e6e321054f9\") " pod="openstack/barbican-db-create-f5f78"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.036260 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b118176-15b4-4d8c-a2d4-8bc3e53dcd60-operator-scripts\") pod \"neutron-db-create-qtzqs\" (UID: \"2b118176-15b4-4d8c-a2d4-8bc3e53dcd60\") " pod="openstack/neutron-db-create-qtzqs"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.036298 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c820403-de63-4498-b9b9-f9881586293a-operator-scripts\") pod \"barbican-c29f-account-create-update-5jmdq\" (UID: \"8c820403-de63-4498-b9b9-f9881586293a\") " pod="openstack/barbican-c29f-account-create-update-5jmdq"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.036342 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5njk\" (UniqueName: \"kubernetes.io/projected/846a5266-babb-4653-8226-952d8e09d90e-kube-api-access-b5njk\") pod \"keystone-db-sync-mrdvb\" (UID: \"846a5266-babb-4653-8226-952d8e09d90e\") " pod="openstack/keystone-db-sync-mrdvb"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.036429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/846a5266-babb-4653-8226-952d8e09d90e-combined-ca-bundle\") pod \"keystone-db-sync-mrdvb\" (UID: \"846a5266-babb-4653-8226-952d8e09d90e\") " pod="openstack/keystone-db-sync-mrdvb"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.036462 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s2w2\" (UniqueName: \"kubernetes.io/projected/1e966eb8-aa23-4b7a-8477-1e6e321054f9-kube-api-access-8s2w2\") pod \"barbican-db-create-f5f78\" (UID: \"1e966eb8-aa23-4b7a-8477-1e6e321054f9\") " pod="openstack/barbican-db-create-f5f78"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.038084 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e966eb8-aa23-4b7a-8477-1e6e321054f9-operator-scripts\") pod \"barbican-db-create-f5f78\" (UID: \"1e966eb8-aa23-4b7a-8477-1e6e321054f9\") " pod="openstack/barbican-db-create-f5f78"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.039205 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c820403-de63-4498-b9b9-f9881586293a-operator-scripts\") pod \"barbican-c29f-account-create-update-5jmdq\" (UID: \"8c820403-de63-4498-b9b9-f9881586293a\") " pod="openstack/barbican-c29f-account-create-update-5jmdq"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.061785 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s2w2\" (UniqueName: \"kubernetes.io/projected/1e966eb8-aa23-4b7a-8477-1e6e321054f9-kube-api-access-8s2w2\") pod \"barbican-db-create-f5f78\" (UID: \"1e966eb8-aa23-4b7a-8477-1e6e321054f9\") " pod="openstack/barbican-db-create-f5f78"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.065289 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9ww6\" (UniqueName: \"kubernetes.io/projected/8c820403-de63-4498-b9b9-f9881586293a-kube-api-access-f9ww6\") pod \"barbican-c29f-account-create-update-5jmdq\" (UID: \"8c820403-de63-4498-b9b9-f9881586293a\") " pod="openstack/barbican-c29f-account-create-update-5jmdq"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.098507 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f5f78"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.122118 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c29f-account-create-update-5jmdq"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.141612 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5njk\" (UniqueName: \"kubernetes.io/projected/846a5266-babb-4653-8226-952d8e09d90e-kube-api-access-b5njk\") pod \"keystone-db-sync-mrdvb\" (UID: \"846a5266-babb-4653-8226-952d8e09d90e\") " pod="openstack/keystone-db-sync-mrdvb"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.141701 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs2s2\" (UniqueName: \"kubernetes.io/projected/8f657e52-1b31-417c-8cf2-093bd5c6b8f2-kube-api-access-gs2s2\") pod \"neutron-964c-account-create-update-vg7qc\" (UID: \"8f657e52-1b31-417c-8cf2-093bd5c6b8f2\") " pod="openstack/neutron-964c-account-create-update-vg7qc"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.141746 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f657e52-1b31-417c-8cf2-093bd5c6b8f2-operator-scripts\") pod \"neutron-964c-account-create-update-vg7qc\" (UID: \"8f657e52-1b31-417c-8cf2-093bd5c6b8f2\") " pod="openstack/neutron-964c-account-create-update-vg7qc"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.141767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/846a5266-babb-4653-8226-952d8e09d90e-combined-ca-bundle\") pod \"keystone-db-sync-mrdvb\" (UID: \"846a5266-babb-4653-8226-952d8e09d90e\") " pod="openstack/keystone-db-sync-mrdvb"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.141832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfknd\" (UniqueName: \"kubernetes.io/projected/2b118176-15b4-4d8c-a2d4-8bc3e53dcd60-kube-api-access-tfknd\") pod \"neutron-db-create-qtzqs\" (UID: \"2b118176-15b4-4d8c-a2d4-8bc3e53dcd60\") " pod="openstack/neutron-db-create-qtzqs"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.141888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/846a5266-babb-4653-8226-952d8e09d90e-config-data\") pod \"keystone-db-sync-mrdvb\" (UID: \"846a5266-babb-4653-8226-952d8e09d90e\") " pod="openstack/keystone-db-sync-mrdvb"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.141915 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b118176-15b4-4d8c-a2d4-8bc3e53dcd60-operator-scripts\") pod \"neutron-db-create-qtzqs\" (UID: \"2b118176-15b4-4d8c-a2d4-8bc3e53dcd60\") " pod="openstack/neutron-db-create-qtzqs"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.142868 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b118176-15b4-4d8c-a2d4-8bc3e53dcd60-operator-scripts\") pod \"neutron-db-create-qtzqs\" (UID: \"2b118176-15b4-4d8c-a2d4-8bc3e53dcd60\") " pod="openstack/neutron-db-create-qtzqs"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.147335 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/846a5266-babb-4653-8226-952d8e09d90e-combined-ca-bundle\") pod \"keystone-db-sync-mrdvb\" (UID: \"846a5266-babb-4653-8226-952d8e09d90e\") " pod="openstack/keystone-db-sync-mrdvb"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.159861 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/846a5266-babb-4653-8226-952d8e09d90e-config-data\") pod \"keystone-db-sync-mrdvb\" (UID: \"846a5266-babb-4653-8226-952d8e09d90e\") " pod="openstack/keystone-db-sync-mrdvb"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.168809 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfknd\" (UniqueName: \"kubernetes.io/projected/2b118176-15b4-4d8c-a2d4-8bc3e53dcd60-kube-api-access-tfknd\") pod \"neutron-db-create-qtzqs\" (UID: \"2b118176-15b4-4d8c-a2d4-8bc3e53dcd60\") " pod="openstack/neutron-db-create-qtzqs"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.169638 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5njk\" (UniqueName: \"kubernetes.io/projected/846a5266-babb-4653-8226-952d8e09d90e-kube-api-access-b5njk\") pod \"keystone-db-sync-mrdvb\" (UID: \"846a5266-babb-4653-8226-952d8e09d90e\") " pod="openstack/keystone-db-sync-mrdvb"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.235267 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mrdvb"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.245436 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs2s2\" (UniqueName: \"kubernetes.io/projected/8f657e52-1b31-417c-8cf2-093bd5c6b8f2-kube-api-access-gs2s2\") pod \"neutron-964c-account-create-update-vg7qc\" (UID: \"8f657e52-1b31-417c-8cf2-093bd5c6b8f2\") " pod="openstack/neutron-964c-account-create-update-vg7qc"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.245509 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f657e52-1b31-417c-8cf2-093bd5c6b8f2-operator-scripts\") pod \"neutron-964c-account-create-update-vg7qc\" (UID: \"8f657e52-1b31-417c-8cf2-093bd5c6b8f2\") " pod="openstack/neutron-964c-account-create-update-vg7qc"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.246437 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f657e52-1b31-417c-8cf2-093bd5c6b8f2-operator-scripts\") pod \"neutron-964c-account-create-update-vg7qc\" (UID: \"8f657e52-1b31-417c-8cf2-093bd5c6b8f2\") " pod="openstack/neutron-964c-account-create-update-vg7qc"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.269266 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs2s2\" (UniqueName: \"kubernetes.io/projected/8f657e52-1b31-417c-8cf2-093bd5c6b8f2-kube-api-access-gs2s2\") pod \"neutron-964c-account-create-update-vg7qc\" (UID: \"8f657e52-1b31-417c-8cf2-093bd5c6b8f2\") " pod="openstack/neutron-964c-account-create-update-vg7qc"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.322453 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qtzqs"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.339296 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-964c-account-create-update-vg7qc"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.383139 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-l9q7r"]
Mar 10 16:08:23 crc kubenswrapper[4749]: W0310 16:08:23.407447 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod489f0d93_0a70_476e_b7ab_7db40933bf88.slice/crio-8043380f55ed7da7533828190001d3087dd0059c8dc787256d607e35593eb9ff WatchSource:0}: Error finding container 8043380f55ed7da7533828190001d3087dd0059c8dc787256d607e35593eb9ff: Status 404 returned error can't find the container with id 8043380f55ed7da7533828190001d3087dd0059c8dc787256d607e35593eb9ff
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.448919 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f5f78"]
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.503272 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c29f-account-create-update-5jmdq"]
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.521540 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9d56-account-create-update-txqfx"]
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.550902 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.567494 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mrdvb"]
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.838607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l9q7r" event={"ID":"489f0d93-0a70-476e-b7ab-7db40933bf88","Type":"ContainerStarted","Data":"8043380f55ed7da7533828190001d3087dd0059c8dc787256d607e35593eb9ff"}
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.839752 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qtzqs"]
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.840866 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qtzqs" event={"ID":"2b118176-15b4-4d8c-a2d4-8bc3e53dcd60","Type":"ContainerStarted","Data":"1beeb96ce10c261e0f7a45c28abf9c8dc7bbff7701da529db99f3ad91c1fa045"}
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.844020 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mrdvb" event={"ID":"846a5266-babb-4653-8226-952d8e09d90e","Type":"ContainerStarted","Data":"7637f6fdc46e752da5738baa23e2bf9f40c11e749e59a8058d4751ca6d75eaad"}
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.845200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f5f78" event={"ID":"1e966eb8-aa23-4b7a-8477-1e6e321054f9","Type":"ContainerStarted","Data":"fc059892474576f13608a7bae64ff2986303407fd46ca9b19efb47b354d66e3e"}
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.845905 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.846067 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c29f-account-create-update-5jmdq" event={"ID":"8c820403-de63-4498-b9b9-f9881586293a","Type":"ContainerStarted","Data":"1019826045b57621bd52ec7d5b0fd06688e207d1b72cf37514e33554d3057cf4"}
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.848364 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9d56-account-create-update-txqfx" event={"ID":"11d152df-3150-4e52-ac41-1288d89383c2","Type":"ContainerStarted","Data":"a6bc6c0562d61bcb45c406d0d882ad47db16e084b26b47b5566a378432f4fed4"}
Mar 10 16:08:23 crc kubenswrapper[4749]: I0310 16:08:23.858790 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-964c-account-create-update-vg7qc"]
Mar 10 16:08:24 crc kubenswrapper[4749]: I0310 16:08:24.877689 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qtzqs" event={"ID":"2b118176-15b4-4d8c-a2d4-8bc3e53dcd60","Type":"ContainerStarted","Data":"a1c3b653dfffe4b6f5efd250adb2e58d82df46e074759a1515886f87c5c6b213"}
Mar 10 16:08:24 crc kubenswrapper[4749]: I0310 16:08:24.883961 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-964c-account-create-update-vg7qc" event={"ID":"8f657e52-1b31-417c-8cf2-093bd5c6b8f2","Type":"ContainerStarted","Data":"73c803d28033d094dfccb98e8dc460d72e72e5409cdd11305221dfb63a577787"}
Mar 10 16:08:24 crc kubenswrapper[4749]: I0310 16:08:24.884012 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-964c-account-create-update-vg7qc" event={"ID":"8f657e52-1b31-417c-8cf2-093bd5c6b8f2","Type":"ContainerStarted","Data":"09569aa0a764fc0a97abf028612983db4832521aa757636464fd2bb4a3c706e3"}
Mar 10 16:08:24 crc kubenswrapper[4749]: I0310 16:08:24.887138 4749 generic.go:334] "Generic (PLEG): container finished" podID="1e966eb8-aa23-4b7a-8477-1e6e321054f9" containerID="d5e76815f4569aaac9c8ae12048a95a98eba3d3491725cb5136e86634b3ac30e" exitCode=0
Mar 10 16:08:24 crc kubenswrapper[4749]: I0310 16:08:24.887207 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f5f78" event={"ID":"1e966eb8-aa23-4b7a-8477-1e6e321054f9","Type":"ContainerDied","Data":"d5e76815f4569aaac9c8ae12048a95a98eba3d3491725cb5136e86634b3ac30e"}
Mar 10 16:08:24 crc kubenswrapper[4749]: I0310 16:08:24.893105 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c29f-account-create-update-5jmdq" event={"ID":"8c820403-de63-4498-b9b9-f9881586293a","Type":"ContainerStarted","Data":"19ae9480ebac8218d9075c95d6eaac4eeee5be1571a897161bbfe018546b3e2f"}
Mar 10 16:08:24 crc kubenswrapper[4749]: I0310 16:08:24.897960 4749 generic.go:334] "Generic (PLEG): container finished" podID="11d152df-3150-4e52-ac41-1288d89383c2" containerID="304c08e229751c0227f2fdf35b286aa22b36f511bc4c299b58bf50e726217a1f" exitCode=0
Mar 10 16:08:24 crc kubenswrapper[4749]: I0310 16:08:24.898019 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9d56-account-create-update-txqfx" event={"ID":"11d152df-3150-4e52-ac41-1288d89383c2","Type":"ContainerDied","Data":"304c08e229751c0227f2fdf35b286aa22b36f511bc4c299b58bf50e726217a1f"}
Mar 10 16:08:24 crc kubenswrapper[4749]: I0310 16:08:24.937974 4749 generic.go:334] "Generic (PLEG): container finished" podID="489f0d93-0a70-476e-b7ab-7db40933bf88" containerID="efb0880b40e62eca1c113563c2bc06265d339a84125d6c22137d5537d40d5f14" exitCode=0
Mar 10 16:08:24 crc kubenswrapper[4749]: I0310 16:08:24.938063 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l9q7r" event={"ID":"489f0d93-0a70-476e-b7ab-7db40933bf88","Type":"ContainerDied","Data":"efb0880b40e62eca1c113563c2bc06265d339a84125d6c22137d5537d40d5f14"}
Mar 10 16:08:24 crc kubenswrapper[4749]: I0310 16:08:24.964867 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fptmr" event={"ID":"709acaab-3856-4321-8076-f615a144105d","Type":"ContainerStarted","Data":"8d2299df2487e769d3166ee36c6b6f3c511bda098ff25e59c2d9dbf96576abe9"}
Mar 10 16:08:25 crc kubenswrapper[4749]: I0310 16:08:25.053068 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-c29f-account-create-update-5jmdq" podStartSLOduration=3.053044499 podStartE2EDuration="3.053044499s" podCreationTimestamp="2026-03-10 16:08:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:08:25.05055918 +0000 UTC m=+1202.172424867" watchObservedRunningTime="2026-03-10 16:08:25.053044499 +0000 UTC m=+1202.174910186"
Mar 10 16:08:25 crc kubenswrapper[4749]: I0310 16:08:25.067831 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-fptmr" podStartSLOduration=3.276434857 podStartE2EDuration="35.067810598s" podCreationTimestamp="2026-03-10 16:07:50 +0000 UTC" firstStartedPulling="2026-03-10 16:07:51.732941856 +0000 UTC m=+1168.854807543" lastFinishedPulling="2026-03-10 16:08:23.524317597 +0000 UTC m=+1200.646183284" observedRunningTime="2026-03-10 16:08:25.066560583 +0000 UTC m=+1202.188426270" watchObservedRunningTime="2026-03-10 16:08:25.067810598 +0000 UTC m=+1202.189676285"
Mar 10 16:08:25 crc kubenswrapper[4749]: I0310 16:08:25.979301 4749 generic.go:334] "Generic (PLEG): container finished" podID="8c820403-de63-4498-b9b9-f9881586293a" containerID="19ae9480ebac8218d9075c95d6eaac4eeee5be1571a897161bbfe018546b3e2f" exitCode=0
Mar 10 16:08:25 crc kubenswrapper[4749]: I0310 16:08:25.979654 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c29f-account-create-update-5jmdq" event={"ID":"8c820403-de63-4498-b9b9-f9881586293a","Type":"ContainerDied","Data":"19ae9480ebac8218d9075c95d6eaac4eeee5be1571a897161bbfe018546b3e2f"}
Mar 10 16:08:25 crc kubenswrapper[4749]: I0310 16:08:25.982795 4749 generic.go:334] "Generic (PLEG): container finished" podID="2b118176-15b4-4d8c-a2d4-8bc3e53dcd60" containerID="a1c3b653dfffe4b6f5efd250adb2e58d82df46e074759a1515886f87c5c6b213" exitCode=0
Mar 10 16:08:25 crc kubenswrapper[4749]: I0310 16:08:25.982849 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qtzqs" event={"ID":"2b118176-15b4-4d8c-a2d4-8bc3e53dcd60","Type":"ContainerDied","Data":"a1c3b653dfffe4b6f5efd250adb2e58d82df46e074759a1515886f87c5c6b213"}
Mar 10 16:08:25 crc kubenswrapper[4749]: I0310 16:08:25.987179 4749 generic.go:334] "Generic (PLEG): container finished" podID="8f657e52-1b31-417c-8cf2-093bd5c6b8f2" containerID="73c803d28033d094dfccb98e8dc460d72e72e5409cdd11305221dfb63a577787" exitCode=0
Mar 10 16:08:25 crc kubenswrapper[4749]: I0310 16:08:25.987473 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-964c-account-create-update-vg7qc" event={"ID":"8f657e52-1b31-417c-8cf2-093bd5c6b8f2","Type":"ContainerDied","Data":"73c803d28033d094dfccb98e8dc460d72e72e5409cdd11305221dfb63a577787"}
Mar 10 16:08:26 crc kubenswrapper[4749]: I0310 16:08:26.365650 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb"
Mar 10 16:08:26 crc kubenswrapper[4749]: I0310 16:08:26.462061 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58df884995-tlc2v"]
Mar 10 16:08:26 crc kubenswrapper[4749]: I0310 16:08:26.462503 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58df884995-tlc2v" podUID="55c4b035-df08-431f-bee2-d02a4709086c" containerName="dnsmasq-dns" containerID="cri-o://f685efb592052a56f6c94e2ace255971bb71e13b88fded9702e4c76fa73446ae" gracePeriod=10
Mar 10 16:08:27 crc kubenswrapper[4749]: I0310 16:08:27.003216 4749 generic.go:334] "Generic (PLEG): container finished" podID="55c4b035-df08-431f-bee2-d02a4709086c" containerID="f685efb592052a56f6c94e2ace255971bb71e13b88fded9702e4c76fa73446ae" exitCode=0
Mar 10 16:08:27 crc kubenswrapper[4749]: I0310 16:08:27.003399 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-tlc2v" event={"ID":"55c4b035-df08-431f-bee2-d02a4709086c","Type":"ContainerDied","Data":"f685efb592052a56f6c94e2ace255971bb71e13b88fded9702e4c76fa73446ae"}
Mar 10 16:08:27 crc kubenswrapper[4749]: I0310 16:08:27.973049 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58df884995-tlc2v" podUID="55c4b035-df08-431f-bee2-d02a4709086c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused"
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.025453 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-964c-account-create-update-vg7qc" event={"ID":"8f657e52-1b31-417c-8cf2-093bd5c6b8f2","Type":"ContainerDied","Data":"09569aa0a764fc0a97abf028612983db4832521aa757636464fd2bb4a3c706e3"}
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.025512 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09569aa0a764fc0a97abf028612983db4832521aa757636464fd2bb4a3c706e3"
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.027710 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9d56-account-create-update-txqfx" event={"ID":"11d152df-3150-4e52-ac41-1288d89383c2","Type":"ContainerDied","Data":"a6bc6c0562d61bcb45c406d0d882ad47db16e084b26b47b5566a378432f4fed4"}
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.027758 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6bc6c0562d61bcb45c406d0d882ad47db16e084b26b47b5566a378432f4fed4"
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.028899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qtzqs" event={"ID":"2b118176-15b4-4d8c-a2d4-8bc3e53dcd60","Type":"ContainerDied","Data":"1beeb96ce10c261e0f7a45c28abf9c8dc7bbff7701da529db99f3ad91c1fa045"}
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.028932 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1beeb96ce10c261e0f7a45c28abf9c8dc7bbff7701da529db99f3ad91c1fa045"
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.265674 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9d56-account-create-update-txqfx"
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.268936 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-964c-account-create-update-vg7qc"
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.276544 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qtzqs"
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.282808 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8rp7\" (UniqueName: \"kubernetes.io/projected/11d152df-3150-4e52-ac41-1288d89383c2-kube-api-access-d8rp7\") pod \"11d152df-3150-4e52-ac41-1288d89383c2\" (UID: \"11d152df-3150-4e52-ac41-1288d89383c2\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.282869 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b118176-15b4-4d8c-a2d4-8bc3e53dcd60-operator-scripts\") pod \"2b118176-15b4-4d8c-a2d4-8bc3e53dcd60\" (UID: \"2b118176-15b4-4d8c-a2d4-8bc3e53dcd60\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.284101 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f657e52-1b31-417c-8cf2-093bd5c6b8f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f657e52-1b31-417c-8cf2-093bd5c6b8f2" (UID: "8f657e52-1b31-417c-8cf2-093bd5c6b8f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.284192 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b118176-15b4-4d8c-a2d4-8bc3e53dcd60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b118176-15b4-4d8c-a2d4-8bc3e53dcd60" (UID: "2b118176-15b4-4d8c-a2d4-8bc3e53dcd60"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.284310 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f657e52-1b31-417c-8cf2-093bd5c6b8f2-operator-scripts\") pod \"8f657e52-1b31-417c-8cf2-093bd5c6b8f2\" (UID: \"8f657e52-1b31-417c-8cf2-093bd5c6b8f2\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.284354 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfknd\" (UniqueName: \"kubernetes.io/projected/2b118176-15b4-4d8c-a2d4-8bc3e53dcd60-kube-api-access-tfknd\") pod \"2b118176-15b4-4d8c-a2d4-8bc3e53dcd60\" (UID: \"2b118176-15b4-4d8c-a2d4-8bc3e53dcd60\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.285234 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs2s2\" (UniqueName: \"kubernetes.io/projected/8f657e52-1b31-417c-8cf2-093bd5c6b8f2-kube-api-access-gs2s2\") pod \"8f657e52-1b31-417c-8cf2-093bd5c6b8f2\" (UID: \"8f657e52-1b31-417c-8cf2-093bd5c6b8f2\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.285388 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11d152df-3150-4e52-ac41-1288d89383c2-operator-scripts\") pod \"11d152df-3150-4e52-ac41-1288d89383c2\" (UID: \"11d152df-3150-4e52-ac41-1288d89383c2\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.287319 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11d152df-3150-4e52-ac41-1288d89383c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11d152df-3150-4e52-ac41-1288d89383c2" (UID: "11d152df-3150-4e52-ac41-1288d89383c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.288941 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11d152df-3150-4e52-ac41-1288d89383c2-kube-api-access-d8rp7" (OuterVolumeSpecName: "kube-api-access-d8rp7") pod "11d152df-3150-4e52-ac41-1288d89383c2" (UID: "11d152df-3150-4e52-ac41-1288d89383c2"). InnerVolumeSpecName "kube-api-access-d8rp7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.290890 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8rp7\" (UniqueName: \"kubernetes.io/projected/11d152df-3150-4e52-ac41-1288d89383c2-kube-api-access-d8rp7\") on node \"crc\" DevicePath \"\""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.290916 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b118176-15b4-4d8c-a2d4-8bc3e53dcd60-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.290950 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f657e52-1b31-417c-8cf2-093bd5c6b8f2-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.290961 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11d152df-3150-4e52-ac41-1288d89383c2-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.299849 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b118176-15b4-4d8c-a2d4-8bc3e53dcd60-kube-api-access-tfknd" (OuterVolumeSpecName: "kube-api-access-tfknd") pod "2b118176-15b4-4d8c-a2d4-8bc3e53dcd60" (UID: "2b118176-15b4-4d8c-a2d4-8bc3e53dcd60"). InnerVolumeSpecName "kube-api-access-tfknd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.308816 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f657e52-1b31-417c-8cf2-093bd5c6b8f2-kube-api-access-gs2s2" (OuterVolumeSpecName: "kube-api-access-gs2s2") pod "8f657e52-1b31-417c-8cf2-093bd5c6b8f2" (UID: "8f657e52-1b31-417c-8cf2-093bd5c6b8f2"). InnerVolumeSpecName "kube-api-access-gs2s2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.327991 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-l9q7r"
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.333069 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f5f78"
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.364912 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c29f-account-create-update-5jmdq"
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.375237 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-tlc2v"
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.391815 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-ovsdbserver-nb\") pod \"55c4b035-df08-431f-bee2-d02a4709086c\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.392048 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s2w2\" (UniqueName: \"kubernetes.io/projected/1e966eb8-aa23-4b7a-8477-1e6e321054f9-kube-api-access-8s2w2\") pod \"1e966eb8-aa23-4b7a-8477-1e6e321054f9\" (UID: \"1e966eb8-aa23-4b7a-8477-1e6e321054f9\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.392489 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9ww6\" (UniqueName: \"kubernetes.io/projected/8c820403-de63-4498-b9b9-f9881586293a-kube-api-access-f9ww6\") pod \"8c820403-de63-4498-b9b9-f9881586293a\" (UID: \"8c820403-de63-4498-b9b9-f9881586293a\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.392612 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-dns-svc\") pod \"55c4b035-df08-431f-bee2-d02a4709086c\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.392684 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489f0d93-0a70-476e-b7ab-7db40933bf88-operator-scripts\") pod \"489f0d93-0a70-476e-b7ab-7db40933bf88\" (UID: \"489f0d93-0a70-476e-b7ab-7db40933bf88\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.392766 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c820403-de63-4498-b9b9-f9881586293a-operator-scripts\") pod \"8c820403-de63-4498-b9b9-f9881586293a\" (UID: \"8c820403-de63-4498-b9b9-f9881586293a\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.392832 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-ovsdbserver-sb\") pod \"55c4b035-df08-431f-bee2-d02a4709086c\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.392910 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e966eb8-aa23-4b7a-8477-1e6e321054f9-operator-scripts\") pod \"1e966eb8-aa23-4b7a-8477-1e6e321054f9\" (UID: \"1e966eb8-aa23-4b7a-8477-1e6e321054f9\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.392980 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-config\") pod \"55c4b035-df08-431f-bee2-d02a4709086c\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.393042 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fltnw\" (UniqueName: \"kubernetes.io/projected/489f0d93-0a70-476e-b7ab-7db40933bf88-kube-api-access-fltnw\") pod \"489f0d93-0a70-476e-b7ab-7db40933bf88\" (UID: \"489f0d93-0a70-476e-b7ab-7db40933bf88\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.393108 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2phk4\" (UniqueName: \"kubernetes.io/projected/55c4b035-df08-431f-bee2-d02a4709086c-kube-api-access-2phk4\") pod \"55c4b035-df08-431f-bee2-d02a4709086c\" (UID: \"55c4b035-df08-431f-bee2-d02a4709086c\") "
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.393710 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfknd\" (UniqueName: \"kubernetes.io/projected/2b118176-15b4-4d8c-a2d4-8bc3e53dcd60-kube-api-access-tfknd\") on node \"crc\" DevicePath \"\""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.393808 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs2s2\" (UniqueName: \"kubernetes.io/projected/8f657e52-1b31-417c-8cf2-093bd5c6b8f2-kube-api-access-gs2s2\") on node \"crc\" DevicePath \"\""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.393766 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c820403-de63-4498-b9b9-f9881586293a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c820403-de63-4498-b9b9-f9881586293a" (UID: "8c820403-de63-4498-b9b9-f9881586293a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.395675 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/489f0d93-0a70-476e-b7ab-7db40933bf88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "489f0d93-0a70-476e-b7ab-7db40933bf88" (UID: "489f0d93-0a70-476e-b7ab-7db40933bf88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.396184 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e966eb8-aa23-4b7a-8477-1e6e321054f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e966eb8-aa23-4b7a-8477-1e6e321054f9" (UID: "1e966eb8-aa23-4b7a-8477-1e6e321054f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.402857 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c4b035-df08-431f-bee2-d02a4709086c-kube-api-access-2phk4" (OuterVolumeSpecName: "kube-api-access-2phk4") pod "55c4b035-df08-431f-bee2-d02a4709086c" (UID: "55c4b035-df08-431f-bee2-d02a4709086c"). InnerVolumeSpecName "kube-api-access-2phk4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.402972 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c820403-de63-4498-b9b9-f9881586293a-kube-api-access-f9ww6" (OuterVolumeSpecName: "kube-api-access-f9ww6") pod "8c820403-de63-4498-b9b9-f9881586293a" (UID: "8c820403-de63-4498-b9b9-f9881586293a"). InnerVolumeSpecName "kube-api-access-f9ww6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.421721 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e966eb8-aa23-4b7a-8477-1e6e321054f9-kube-api-access-8s2w2" (OuterVolumeSpecName: "kube-api-access-8s2w2") pod "1e966eb8-aa23-4b7a-8477-1e6e321054f9" (UID: "1e966eb8-aa23-4b7a-8477-1e6e321054f9"). InnerVolumeSpecName "kube-api-access-8s2w2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.424249 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/489f0d93-0a70-476e-b7ab-7db40933bf88-kube-api-access-fltnw" (OuterVolumeSpecName: "kube-api-access-fltnw") pod "489f0d93-0a70-476e-b7ab-7db40933bf88" (UID: "489f0d93-0a70-476e-b7ab-7db40933bf88"). InnerVolumeSpecName "kube-api-access-fltnw".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.471096 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-config" (OuterVolumeSpecName: "config") pod "55c4b035-df08-431f-bee2-d02a4709086c" (UID: "55c4b035-df08-431f-bee2-d02a4709086c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.472357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "55c4b035-df08-431f-bee2-d02a4709086c" (UID: "55c4b035-df08-431f-bee2-d02a4709086c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.478045 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "55c4b035-df08-431f-bee2-d02a4709086c" (UID: "55c4b035-df08-431f-bee2-d02a4709086c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.479748 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "55c4b035-df08-431f-bee2-d02a4709086c" (UID: "55c4b035-df08-431f-bee2-d02a4709086c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.495165 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9ww6\" (UniqueName: \"kubernetes.io/projected/8c820403-de63-4498-b9b9-f9881586293a-kube-api-access-f9ww6\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.495204 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.495215 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489f0d93-0a70-476e-b7ab-7db40933bf88-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.495226 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c820403-de63-4498-b9b9-f9881586293a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.495234 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.495243 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e966eb8-aa23-4b7a-8477-1e6e321054f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.495252 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.495263 4749 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fltnw\" (UniqueName: \"kubernetes.io/projected/489f0d93-0a70-476e-b7ab-7db40933bf88-kube-api-access-fltnw\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.495274 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2phk4\" (UniqueName: \"kubernetes.io/projected/55c4b035-df08-431f-bee2-d02a4709086c-kube-api-access-2phk4\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.495282 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55c4b035-df08-431f-bee2-d02a4709086c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:29 crc kubenswrapper[4749]: I0310 16:08:29.495291 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s2w2\" (UniqueName: \"kubernetes.io/projected/1e966eb8-aa23-4b7a-8477-1e6e321054f9-kube-api-access-8s2w2\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.037710 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mrdvb" event={"ID":"846a5266-babb-4653-8226-952d8e09d90e","Type":"ContainerStarted","Data":"8d114f77fae8a932170e8cb48be64485ebed83fe5f6058822201930cc9f723a6"} Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.039293 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f5f78" event={"ID":"1e966eb8-aa23-4b7a-8477-1e6e321054f9","Type":"ContainerDied","Data":"fc059892474576f13608a7bae64ff2986303407fd46ca9b19efb47b354d66e3e"} Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.039362 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc059892474576f13608a7bae64ff2986303407fd46ca9b19efb47b354d66e3e" Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.039354 4749 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-db-create-f5f78" Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.041700 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58df884995-tlc2v" event={"ID":"55c4b035-df08-431f-bee2-d02a4709086c","Type":"ContainerDied","Data":"2a548d1983e838050ed345201ab4005b153161bef3ed16e7fbb80231b2563b5e"} Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.041731 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58df884995-tlc2v" Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.041745 4749 scope.go:117] "RemoveContainer" containerID="f685efb592052a56f6c94e2ace255971bb71e13b88fded9702e4c76fa73446ae" Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.043304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c29f-account-create-update-5jmdq" event={"ID":"8c820403-de63-4498-b9b9-f9881586293a","Type":"ContainerDied","Data":"1019826045b57621bd52ec7d5b0fd06688e207d1b72cf37514e33554d3057cf4"} Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.043327 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c29f-account-create-update-5jmdq" Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.043333 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1019826045b57621bd52ec7d5b0fd06688e207d1b72cf37514e33554d3057cf4" Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.045215 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l9q7r" event={"ID":"489f0d93-0a70-476e-b7ab-7db40933bf88","Type":"ContainerDied","Data":"8043380f55ed7da7533828190001d3087dd0059c8dc787256d607e35593eb9ff"} Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.045244 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8043380f55ed7da7533828190001d3087dd0059c8dc787256d607e35593eb9ff" Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.045271 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qtzqs" Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.045304 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9d56-account-create-update-txqfx" Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.045336 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-964c-account-create-update-vg7qc" Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.045357 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-l9q7r" Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.068424 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-mrdvb" podStartSLOduration=2.657497386 podStartE2EDuration="8.068407563s" podCreationTimestamp="2026-03-10 16:08:22 +0000 UTC" firstStartedPulling="2026-03-10 16:08:23.690691889 +0000 UTC m=+1200.812557586" lastFinishedPulling="2026-03-10 16:08:29.101602066 +0000 UTC m=+1206.223467763" observedRunningTime="2026-03-10 16:08:30.06471504 +0000 UTC m=+1207.186580727" watchObservedRunningTime="2026-03-10 16:08:30.068407563 +0000 UTC m=+1207.190273250" Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.096856 4749 scope.go:117] "RemoveContainer" containerID="603b6b281fddc022de160bba7a0bb0d9919ef6c177c6e360d7e71a76f96d0e36" Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.145888 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58df884995-tlc2v"] Mar 10 16:08:30 crc kubenswrapper[4749]: I0310 16:08:30.154964 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58df884995-tlc2v"] Mar 10 16:08:31 crc kubenswrapper[4749]: I0310 16:08:31.619650 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c4b035-df08-431f-bee2-d02a4709086c" path="/var/lib/kubelet/pods/55c4b035-df08-431f-bee2-d02a4709086c/volumes" Mar 10 16:08:34 crc kubenswrapper[4749]: I0310 16:08:34.104264 4749 generic.go:334] "Generic (PLEG): container finished" podID="709acaab-3856-4321-8076-f615a144105d" containerID="8d2299df2487e769d3166ee36c6b6f3c511bda098ff25e59c2d9dbf96576abe9" exitCode=0 Mar 10 16:08:34 crc kubenswrapper[4749]: I0310 16:08:34.104369 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fptmr" event={"ID":"709acaab-3856-4321-8076-f615a144105d","Type":"ContainerDied","Data":"8d2299df2487e769d3166ee36c6b6f3c511bda098ff25e59c2d9dbf96576abe9"} 
Mar 10 16:08:34 crc kubenswrapper[4749]: I0310 16:08:34.111294 4749 generic.go:334] "Generic (PLEG): container finished" podID="846a5266-babb-4653-8226-952d8e09d90e" containerID="8d114f77fae8a932170e8cb48be64485ebed83fe5f6058822201930cc9f723a6" exitCode=0 Mar 10 16:08:34 crc kubenswrapper[4749]: I0310 16:08:34.111351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mrdvb" event={"ID":"846a5266-babb-4653-8226-952d8e09d90e","Type":"ContainerDied","Data":"8d114f77fae8a932170e8cb48be64485ebed83fe5f6058822201930cc9f723a6"} Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.473531 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mrdvb" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.513989 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/846a5266-babb-4653-8226-952d8e09d90e-combined-ca-bundle\") pod \"846a5266-babb-4653-8226-952d8e09d90e\" (UID: \"846a5266-babb-4653-8226-952d8e09d90e\") " Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.514066 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/846a5266-babb-4653-8226-952d8e09d90e-config-data\") pod \"846a5266-babb-4653-8226-952d8e09d90e\" (UID: \"846a5266-babb-4653-8226-952d8e09d90e\") " Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.514282 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5njk\" (UniqueName: \"kubernetes.io/projected/846a5266-babb-4653-8226-952d8e09d90e-kube-api-access-b5njk\") pod \"846a5266-babb-4653-8226-952d8e09d90e\" (UID: \"846a5266-babb-4653-8226-952d8e09d90e\") " Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.519455 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/846a5266-babb-4653-8226-952d8e09d90e-kube-api-access-b5njk" (OuterVolumeSpecName: "kube-api-access-b5njk") pod "846a5266-babb-4653-8226-952d8e09d90e" (UID: "846a5266-babb-4653-8226-952d8e09d90e"). InnerVolumeSpecName "kube-api-access-b5njk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.538742 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/846a5266-babb-4653-8226-952d8e09d90e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "846a5266-babb-4653-8226-952d8e09d90e" (UID: "846a5266-babb-4653-8226-952d8e09d90e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.564240 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/846a5266-babb-4653-8226-952d8e09d90e-config-data" (OuterVolumeSpecName: "config-data") pod "846a5266-babb-4653-8226-952d8e09d90e" (UID: "846a5266-babb-4653-8226-952d8e09d90e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.616882 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/846a5266-babb-4653-8226-952d8e09d90e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.616914 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/846a5266-babb-4653-8226-952d8e09d90e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.616926 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5njk\" (UniqueName: \"kubernetes.io/projected/846a5266-babb-4653-8226-952d8e09d90e-kube-api-access-b5njk\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.619785 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-fptmr" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.718131 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpklb\" (UniqueName: \"kubernetes.io/projected/709acaab-3856-4321-8076-f615a144105d-kube-api-access-dpklb\") pod \"709acaab-3856-4321-8076-f615a144105d\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.718605 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-config-data\") pod \"709acaab-3856-4321-8076-f615a144105d\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.718719 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-combined-ca-bundle\") pod \"709acaab-3856-4321-8076-f615a144105d\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.718786 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-db-sync-config-data\") pod \"709acaab-3856-4321-8076-f615a144105d\" (UID: \"709acaab-3856-4321-8076-f615a144105d\") " Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.722429 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "709acaab-3856-4321-8076-f615a144105d" (UID: "709acaab-3856-4321-8076-f615a144105d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.723944 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/709acaab-3856-4321-8076-f615a144105d-kube-api-access-dpklb" (OuterVolumeSpecName: "kube-api-access-dpklb") pod "709acaab-3856-4321-8076-f615a144105d" (UID: "709acaab-3856-4321-8076-f615a144105d"). InnerVolumeSpecName "kube-api-access-dpklb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.739887 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "709acaab-3856-4321-8076-f615a144105d" (UID: "709acaab-3856-4321-8076-f615a144105d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.772742 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-config-data" (OuterVolumeSpecName: "config-data") pod "709acaab-3856-4321-8076-f615a144105d" (UID: "709acaab-3856-4321-8076-f615a144105d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.820013 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.820051 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.820065 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/709acaab-3856-4321-8076-f615a144105d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:35 crc kubenswrapper[4749]: I0310 16:08:35.820077 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpklb\" (UniqueName: \"kubernetes.io/projected/709acaab-3856-4321-8076-f615a144105d-kube-api-access-dpklb\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.141880 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fptmr" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.142462 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fptmr" event={"ID":"709acaab-3856-4321-8076-f615a144105d","Type":"ContainerDied","Data":"a91895416d20db9fd60db4ac4c1c9206ef2e77086f10d00a7d4ee693ff8237ce"} Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.142713 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a91895416d20db9fd60db4ac4c1c9206ef2e77086f10d00a7d4ee693ff8237ce" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.143801 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mrdvb" event={"ID":"846a5266-babb-4653-8226-952d8e09d90e","Type":"ContainerDied","Data":"7637f6fdc46e752da5738baa23e2bf9f40c11e749e59a8058d4751ca6d75eaad"} Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.143840 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7637f6fdc46e752da5738baa23e2bf9f40c11e749e59a8058d4751ca6d75eaad" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.143877 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-mrdvb" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.508714 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-679ccc59c7-m8lbg"] Mar 10 16:08:36 crc kubenswrapper[4749]: E0310 16:08:36.509069 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c4b035-df08-431f-bee2-d02a4709086c" containerName="init" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509081 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c4b035-df08-431f-bee2-d02a4709086c" containerName="init" Mar 10 16:08:36 crc kubenswrapper[4749]: E0310 16:08:36.509094 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f657e52-1b31-417c-8cf2-093bd5c6b8f2" containerName="mariadb-account-create-update" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509100 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f657e52-1b31-417c-8cf2-093bd5c6b8f2" containerName="mariadb-account-create-update" Mar 10 16:08:36 crc kubenswrapper[4749]: E0310 16:08:36.509117 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c4b035-df08-431f-bee2-d02a4709086c" containerName="dnsmasq-dns" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509123 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c4b035-df08-431f-bee2-d02a4709086c" containerName="dnsmasq-dns" Mar 10 16:08:36 crc kubenswrapper[4749]: E0310 16:08:36.509134 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d152df-3150-4e52-ac41-1288d89383c2" containerName="mariadb-account-create-update" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509141 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d152df-3150-4e52-ac41-1288d89383c2" containerName="mariadb-account-create-update" Mar 10 16:08:36 crc kubenswrapper[4749]: E0310 16:08:36.509149 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1e966eb8-aa23-4b7a-8477-1e6e321054f9" containerName="mariadb-database-create" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509155 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e966eb8-aa23-4b7a-8477-1e6e321054f9" containerName="mariadb-database-create" Mar 10 16:08:36 crc kubenswrapper[4749]: E0310 16:08:36.509168 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="489f0d93-0a70-476e-b7ab-7db40933bf88" containerName="mariadb-database-create" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509173 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="489f0d93-0a70-476e-b7ab-7db40933bf88" containerName="mariadb-database-create" Mar 10 16:08:36 crc kubenswrapper[4749]: E0310 16:08:36.509184 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="846a5266-babb-4653-8226-952d8e09d90e" containerName="keystone-db-sync" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509191 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="846a5266-babb-4653-8226-952d8e09d90e" containerName="keystone-db-sync" Mar 10 16:08:36 crc kubenswrapper[4749]: E0310 16:08:36.509204 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b118176-15b4-4d8c-a2d4-8bc3e53dcd60" containerName="mariadb-database-create" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509210 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b118176-15b4-4d8c-a2d4-8bc3e53dcd60" containerName="mariadb-database-create" Mar 10 16:08:36 crc kubenswrapper[4749]: E0310 16:08:36.509219 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709acaab-3856-4321-8076-f615a144105d" containerName="glance-db-sync" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509226 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="709acaab-3856-4321-8076-f615a144105d" containerName="glance-db-sync" Mar 10 16:08:36 crc kubenswrapper[4749]: E0310 16:08:36.509237 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8c820403-de63-4498-b9b9-f9881586293a" containerName="mariadb-account-create-update" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509243 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c820403-de63-4498-b9b9-f9881586293a" containerName="mariadb-account-create-update" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509423 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e966eb8-aa23-4b7a-8477-1e6e321054f9" containerName="mariadb-database-create" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509436 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b118176-15b4-4d8c-a2d4-8bc3e53dcd60" containerName="mariadb-database-create" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509447 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f657e52-1b31-417c-8cf2-093bd5c6b8f2" containerName="mariadb-account-create-update" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509459 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="489f0d93-0a70-476e-b7ab-7db40933bf88" containerName="mariadb-database-create" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509468 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="11d152df-3150-4e52-ac41-1288d89383c2" containerName="mariadb-account-create-update" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509478 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c820403-de63-4498-b9b9-f9881586293a" containerName="mariadb-account-create-update" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509489 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="846a5266-babb-4653-8226-952d8e09d90e" containerName="keystone-db-sync" Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509499 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c4b035-df08-431f-bee2-d02a4709086c" 
containerName="dnsmasq-dns"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.509511 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="709acaab-3856-4321-8076-f615a144105d" containerName="glance-db-sync"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.518588 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.532481 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-dns-svc\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.532561 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-config\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.532597 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-ovsdbserver-nb\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.532640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwwmz\" (UniqueName: \"kubernetes.io/projected/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-kube-api-access-hwwmz\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.532662 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-dns-swift-storage-0\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.532728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-ovsdbserver-sb\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.539837 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vl2v7"]
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.541496 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.557631 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ppc5j"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.557836 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.558012 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.558114 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.558256 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.560595 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-679ccc59c7-m8lbg"]
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.575471 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vl2v7"]
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.635813 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-ovsdbserver-sb\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.635894 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-dns-svc\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.635927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-config\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.635946 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-ovsdbserver-nb\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.635972 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwwmz\" (UniqueName: \"kubernetes.io/projected/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-kube-api-access-hwwmz\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.635990 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-dns-swift-storage-0\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.636867 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-dns-swift-storage-0\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.637134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-dns-svc\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.637356 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-config\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.637878 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-ovsdbserver-nb\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.638321 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-ovsdbserver-sb\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.673197 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwwmz\" (UniqueName: \"kubernetes.io/projected/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-kube-api-access-hwwmz\") pod \"dnsmasq-dns-679ccc59c7-m8lbg\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.723018 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-679ccc59c7-m8lbg"]
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.730060 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.738399 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-credential-keys\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.738885 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-fernet-keys\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.738915 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-config-data\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.739012 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prk4n\" (UniqueName: \"kubernetes.io/projected/4cb0aebf-48f9-49b9-a669-141c187f6393-kube-api-access-prk4n\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.739078 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-scripts\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.739167 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-combined-ca-bundle\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.841811 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-combined-ca-bundle\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.841873 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-credential-keys\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.841918 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-fernet-keys\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.841949 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-config-data\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.842000 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prk4n\" (UniqueName: \"kubernetes.io/projected/4cb0aebf-48f9-49b9-a669-141c187f6393-kube-api-access-prk4n\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.842048 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-scripts\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.857353 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-combined-ca-bundle\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.864128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-fernet-keys\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.868259 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-config-data\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.868579 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-scripts\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.879081 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-credential-keys\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.896757 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d6d6f68c7-r4j4r"]
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.898011 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.939279 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prk4n\" (UniqueName: \"kubernetes.io/projected/4cb0aebf-48f9-49b9-a669-141c187f6393-kube-api-access-prk4n\") pod \"keystone-bootstrap-vl2v7\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:36 crc kubenswrapper[4749]: I0310 16:08:36.959032 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d6d6f68c7-r4j4r"]
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.049818 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-dns-svc\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.049884 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-dns-swift-storage-0\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.049906 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-config\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.050075 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgmsp\" (UniqueName: \"kubernetes.io/projected/25ea755f-102d-437b-a255-9ef2589f9895-kube-api-access-mgmsp\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.050130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-ovsdbserver-sb\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.050163 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-ovsdbserver-nb\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.056466 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-ff4lh"]
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.057843 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.079040 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.088120 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.092960 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-q4tb7"]
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.097583 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qfn9s"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.099800 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q4tb7"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.118978 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.119324 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-48cn6"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.119502 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.133813 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ff4lh"]
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.180901 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sg2w\" (UniqueName: \"kubernetes.io/projected/876272e9-3af8-40ba-aac7-40f8cecc909e-kube-api-access-8sg2w\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.181003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-ovsdbserver-nb\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.181061 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-db-sync-config-data\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.181118 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-scripts\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.181218 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-dns-svc\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.181267 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-dns-swift-storage-0\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.181295 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-config\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.181334 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/876272e9-3af8-40ba-aac7-40f8cecc909e-etc-machine-id\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.181419 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-combined-ca-bundle\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.181465 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgmsp\" (UniqueName: \"kubernetes.io/projected/25ea755f-102d-437b-a255-9ef2589f9895-kube-api-access-mgmsp\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.181502 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-config-data\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.181537 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-ovsdbserver-sb\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.193508 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-dns-swift-storage-0\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.195827 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-dns-svc\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.198778 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-q4tb7"]
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.204513 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-ovsdbserver-sb\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.206031 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vl2v7"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.207528 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-config\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.265840 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-ovsdbserver-nb\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.296629 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/876272e9-3af8-40ba-aac7-40f8cecc909e-etc-machine-id\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.296770 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-combined-ca-bundle\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.296850 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-config-data\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.297453 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/876272e9-3af8-40ba-aac7-40f8cecc909e-etc-machine-id\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.303929 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sg2w\" (UniqueName: \"kubernetes.io/projected/876272e9-3af8-40ba-aac7-40f8cecc909e-kube-api-access-8sg2w\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.304056 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-db-sync-config-data\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.304139 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-scripts\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.304199 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a5d1831-eae7-4ede-a37d-158ef6140d54-config\") pod \"neutron-db-sync-q4tb7\" (UID: \"7a5d1831-eae7-4ede-a37d-158ef6140d54\") " pod="openstack/neutron-db-sync-q4tb7"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.312640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hndmt\" (UniqueName: \"kubernetes.io/projected/7a5d1831-eae7-4ede-a37d-158ef6140d54-kube-api-access-hndmt\") pod \"neutron-db-sync-q4tb7\" (UID: \"7a5d1831-eae7-4ede-a37d-158ef6140d54\") " pod="openstack/neutron-db-sync-q4tb7"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.312844 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5d1831-eae7-4ede-a37d-158ef6140d54-combined-ca-bundle\") pod \"neutron-db-sync-q4tb7\" (UID: \"7a5d1831-eae7-4ede-a37d-158ef6140d54\") " pod="openstack/neutron-db-sync-q4tb7"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.313247 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-db-sync-config-data\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.318879 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-scripts\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.338650 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-combined-ca-bundle\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.344515 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-config-data\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.345126 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgmsp\" (UniqueName: \"kubernetes.io/projected/25ea755f-102d-437b-a255-9ef2589f9895-kube-api-access-mgmsp\") pod \"dnsmasq-dns-d6d6f68c7-r4j4r\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.354179 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sg2w\" (UniqueName: \"kubernetes.io/projected/876272e9-3af8-40ba-aac7-40f8cecc909e-kube-api-access-8sg2w\") pod \"cinder-db-sync-ff4lh\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.385389 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ff4lh"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.401285 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-tbtx4"]
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.403040 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-tbtx4"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.407823 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.407994 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-s4t7x"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.411671 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.419247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a5d1831-eae7-4ede-a37d-158ef6140d54-config\") pod \"neutron-db-sync-q4tb7\" (UID: \"7a5d1831-eae7-4ede-a37d-158ef6140d54\") " pod="openstack/neutron-db-sync-q4tb7"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.419301 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hndmt\" (UniqueName: \"kubernetes.io/projected/7a5d1831-eae7-4ede-a37d-158ef6140d54-kube-api-access-hndmt\") pod \"neutron-db-sync-q4tb7\" (UID: \"7a5d1831-eae7-4ede-a37d-158ef6140d54\") " pod="openstack/neutron-db-sync-q4tb7"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.419348 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5d1831-eae7-4ede-a37d-158ef6140d54-combined-ca-bundle\") pod \"neutron-db-sync-q4tb7\" (UID: \"7a5d1831-eae7-4ede-a37d-158ef6140d54\") " pod="openstack/neutron-db-sync-q4tb7"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.442654 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a5d1831-eae7-4ede-a37d-158ef6140d54-config\") pod \"neutron-db-sync-q4tb7\" (UID: \"7a5d1831-eae7-4ede-a37d-158ef6140d54\") " pod="openstack/neutron-db-sync-q4tb7"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.443882 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5d1831-eae7-4ede-a37d-158ef6140d54-combined-ca-bundle\") pod \"neutron-db-sync-q4tb7\" (UID: \"7a5d1831-eae7-4ede-a37d-158ef6140d54\") " pod="openstack/neutron-db-sync-q4tb7"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.457180 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-tbtx4"]
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.459363 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hndmt\" (UniqueName: \"kubernetes.io/projected/7a5d1831-eae7-4ede-a37d-158ef6140d54-kube-api-access-hndmt\") pod \"neutron-db-sync-q4tb7\" (UID: \"7a5d1831-eae7-4ede-a37d-158ef6140d54\") " pod="openstack/neutron-db-sync-q4tb7"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.504434 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-gwdl7"]
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.505538 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gwdl7"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.513246 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.513560 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zjqk6"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.520901 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-logs\") pod \"placement-db-sync-tbtx4\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.520991 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-config-data\") pod \"placement-db-sync-tbtx4\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.521031 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27gk2\" (UniqueName: \"kubernetes.io/projected/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-kube-api-access-27gk2\") pod \"placement-db-sync-tbtx4\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4"
Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.521094 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-combined-ca-bundle\") pod \"placement-db-sync-tbtx4\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4"
Mar 10
16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.521119 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-scripts\") pod \"placement-db-sync-tbtx4\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.527263 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.529992 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.530747 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q4tb7" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.542070 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.542275 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.553503 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.582219 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gwdl7"] Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628150 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqh5l\" (UniqueName: \"kubernetes.io/projected/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-kube-api-access-mqh5l\") pod \"barbican-db-sync-gwdl7\" (UID: \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\") " pod="openstack/barbican-db-sync-gwdl7" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628563 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628582 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-scripts\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628632 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-combined-ca-bundle\") pod \"barbican-db-sync-gwdl7\" (UID: \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\") " pod="openstack/barbican-db-sync-gwdl7" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628650 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-config-data\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628674 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-logs\") pod \"placement-db-sync-tbtx4\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628713 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-db-sync-config-data\") pod \"barbican-db-sync-gwdl7\" (UID: \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\") " pod="openstack/barbican-db-sync-gwdl7" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628744 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-config-data\") pod \"placement-db-sync-tbtx4\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628766 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628785 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27gk2\" (UniqueName: \"kubernetes.io/projected/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-kube-api-access-27gk2\") pod \"placement-db-sync-tbtx4\" (UID: 
\"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-combined-ca-bundle\") pod \"placement-db-sync-tbtx4\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628831 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-scripts\") pod \"placement-db-sync-tbtx4\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628851 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fea9311-0e47-4352-8d4f-ac90db816fc1-run-httpd\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6w4s\" (UniqueName: \"kubernetes.io/projected/3fea9311-0e47-4352-8d4f-ac90db816fc1-kube-api-access-m6w4s\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.628906 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fea9311-0e47-4352-8d4f-ac90db816fc1-log-httpd\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 
16:08:37.629449 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-logs\") pod \"placement-db-sync-tbtx4\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.635694 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-config-data\") pod \"placement-db-sync-tbtx4\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.638609 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-combined-ca-bundle\") pod \"placement-db-sync-tbtx4\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.640284 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-scripts\") pod \"placement-db-sync-tbtx4\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.659422 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27gk2\" (UniqueName: \"kubernetes.io/projected/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-kube-api-access-27gk2\") pod \"placement-db-sync-tbtx4\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " pod="openstack/placement-db-sync-tbtx4" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.688765 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d6d6f68c7-r4j4r"] Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 
16:08:37.688815 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.688828 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6478444fbc-gx2n2"] Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.690251 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6478444fbc-gx2n2"] Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.690332 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.731019 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqh5l\" (UniqueName: \"kubernetes.io/projected/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-kube-api-access-mqh5l\") pod \"barbican-db-sync-gwdl7\" (UID: \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\") " pod="openstack/barbican-db-sync-gwdl7" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.731257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.731360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-scripts\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.731467 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.731536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-combined-ca-bundle\") pod \"barbican-db-sync-gwdl7\" (UID: \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\") " pod="openstack/barbican-db-sync-gwdl7" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.731602 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-config-data\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.731688 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-db-sync-config-data\") pod \"barbican-db-sync-gwdl7\" (UID: \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\") " pod="openstack/barbican-db-sync-gwdl7" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.731791 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhgx4\" (UniqueName: \"kubernetes.io/projected/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-kube-api-access-qhgx4\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.731881 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " 
pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.731977 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-dns-swift-storage-0\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.732139 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fea9311-0e47-4352-8d4f-ac90db816fc1-run-httpd\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.732260 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-dns-svc\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.732351 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6w4s\" (UniqueName: \"kubernetes.io/projected/3fea9311-0e47-4352-8d4f-ac90db816fc1-kube-api-access-m6w4s\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.732522 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fea9311-0e47-4352-8d4f-ac90db816fc1-log-httpd\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.732632 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-config\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.732743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-ovsdbserver-sb\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.737176 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.740101 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.744744 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fea9311-0e47-4352-8d4f-ac90db816fc1-run-httpd\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.745985 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tq4t4" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.746248 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-combined-ca-bundle\") pod \"barbican-db-sync-gwdl7\" (UID: \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\") " pod="openstack/barbican-db-sync-gwdl7" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 
16:08:37.746570 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fea9311-0e47-4352-8d4f-ac90db816fc1-log-httpd\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.746782 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.749467 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-config-data\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.754354 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.754684 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.755199 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.755814 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-scripts\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " 
pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.758391 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.765137 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-db-sync-config-data\") pod \"barbican-db-sync-gwdl7\" (UID: \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\") " pod="openstack/barbican-db-sync-gwdl7" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.778072 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqh5l\" (UniqueName: \"kubernetes.io/projected/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-kube-api-access-mqh5l\") pod \"barbican-db-sync-gwdl7\" (UID: \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\") " pod="openstack/barbican-db-sync-gwdl7" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.780496 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6w4s\" (UniqueName: \"kubernetes.io/projected/3fea9311-0e47-4352-8d4f-ac90db816fc1-kube-api-access-m6w4s\") pod \"ceilometer-0\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " pod="openstack/ceilometer-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.851576 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a9f91e-926e-4593-becb-69b5a9e7963a-logs\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.851631 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" 
(UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.851661 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-ovsdbserver-nb\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.851690 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j84tj\" (UniqueName: \"kubernetes.io/projected/d3a9f91e-926e-4593-becb-69b5a9e7963a-kube-api-access-j84tj\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.851728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.851766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhgx4\" (UniqueName: \"kubernetes.io/projected/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-kube-api-access-qhgx4\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.851789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.851820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-dns-svc\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.851853 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.851879 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-config\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.851897 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3a9f91e-926e-4593-becb-69b5a9e7963a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.851915 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.851939 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-ovsdbserver-sb\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.852823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-ovsdbserver-sb\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.853350 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-config\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.853866 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-ovsdbserver-nb\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.853893 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-dns-svc\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc 
kubenswrapper[4749]: I0310 16:08:37.855590 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-dns-swift-storage-0\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.877027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhgx4\" (UniqueName: \"kubernetes.io/projected/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-kube-api-access-qhgx4\") pod \"dnsmasq-dns-6478444fbc-gx2n2\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.909322 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-679ccc59c7-m8lbg"] Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.914116 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-tbtx4" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.953573 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a9f91e-926e-4593-becb-69b5a9e7963a-logs\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.953640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.953679 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j84tj\" (UniqueName: \"kubernetes.io/projected/d3a9f91e-926e-4593-becb-69b5a9e7963a-kube-api-access-j84tj\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.953726 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.953797 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 
16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.953834 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3a9f91e-926e-4593-becb-69b5a9e7963a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.953858 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.954100 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a9f91e-926e-4593-becb-69b5a9e7963a-logs\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.954989 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gwdl7" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.957071 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.957341 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3a9f91e-926e-4593-becb-69b5a9e7963a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.959557 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.963890 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.964300 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc 
kubenswrapper[4749]: I0310 16:08:37.983336 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j84tj\" (UniqueName: \"kubernetes.io/projected/d3a9f91e-926e-4593-becb-69b5a9e7963a-kube-api-access-j84tj\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:37 crc kubenswrapper[4749]: I0310 16:08:37.991640 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.028460 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.040858 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.091075 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.152047 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.154775 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.157822 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.200662 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vl2v7"] Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.227529 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.262587 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpbmd\" (UniqueName: \"kubernetes.io/projected/10d67052-4827-4388-b0c8-c737b9837674-kube-api-access-fpbmd\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.262795 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-config-data\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.262889 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.263058 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-scripts\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.263188 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10d67052-4827-4388-b0c8-c737b9837674-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.263221 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10d67052-4827-4388-b0c8-c737b9837674-logs\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.263269 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.274458 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-q4tb7"] Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.287903 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ff4lh"] Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.365426 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg" 
event={"ID":"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad","Type":"ContainerStarted","Data":"c2527ee3ac8a9d991bb32e2e23b7ada73b28418aee8123ba372ad6bcb32099e3"} Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.366507 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.367049 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-scripts\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.367275 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10d67052-4827-4388-b0c8-c737b9837674-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.367331 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10d67052-4827-4388-b0c8-c737b9837674-logs\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.367365 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.367498 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpbmd\" (UniqueName: \"kubernetes.io/projected/10d67052-4827-4388-b0c8-c737b9837674-kube-api-access-fpbmd\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.367680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-config-data\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.369027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10d67052-4827-4388-b0c8-c737b9837674-logs\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.369530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10d67052-4827-4388-b0c8-c737b9837674-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.369564 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q4tb7" event={"ID":"7a5d1831-eae7-4ede-a37d-158ef6140d54","Type":"ContainerStarted","Data":"70fb6b2dc3ee0202064475af5ec26270d4dcf00cefc7f9bdf2917a79af108fb7"} Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.369733 4749 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.372637 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ff4lh" event={"ID":"876272e9-3af8-40ba-aac7-40f8cecc909e","Type":"ContainerStarted","Data":"7365c2c026559dbb7d4c5393b5a1c9b63c97be4bb76f06a335d3080cfa536716"} Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.374400 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-config-data\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.378831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-scripts\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.379775 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vl2v7" event={"ID":"4cb0aebf-48f9-49b9-a669-141c187f6393","Type":"ContainerStarted","Data":"896da32e1cd189e3abc2287336a96f81ca5e4f2af6656f41328ba7756dbdad8d"} Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.380388 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.409240 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.440704 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpbmd\" (UniqueName: \"kubernetes.io/projected/10d67052-4827-4388-b0c8-c737b9837674-kube-api-access-fpbmd\") pod \"glance-default-internal-api-0\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.482101 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d6d6f68c7-r4j4r"] Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.503454 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.553543 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-tbtx4"] Mar 10 16:08:38 crc kubenswrapper[4749]: W0310 16:08:38.609166 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d4d52fc_2652_4251_afea_b3d1e39ed0f3.slice/crio-badb3cbf5eba73ad3dc9700c085e285054828a3ced8b6a032338ddd38ad6d4c3 WatchSource:0}: Error finding container badb3cbf5eba73ad3dc9700c085e285054828a3ced8b6a032338ddd38ad6d4c3: Status 404 returned error can't find the container with id badb3cbf5eba73ad3dc9700c085e285054828a3ced8b6a032338ddd38ad6d4c3 Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.669308 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gwdl7"] Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.814186 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:08:38 crc kubenswrapper[4749]: W0310 16:08:38.833007 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fea9311_0e47_4352_8d4f_ac90db816fc1.slice/crio-3960ea3070dcbafeb98063c9c59c53d1c83b53838e9d56884ff6de370ae06aaa WatchSource:0}: Error finding container 3960ea3070dcbafeb98063c9c59c53d1c83b53838e9d56884ff6de370ae06aaa: Status 404 returned error can't find the container with id 3960ea3070dcbafeb98063c9c59c53d1c83b53838e9d56884ff6de370ae06aaa Mar 10 16:08:38 crc kubenswrapper[4749]: I0310 16:08:38.903519 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6478444fbc-gx2n2"] Mar 10 16:08:38 crc kubenswrapper[4749]: W0310 16:08:38.923678 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b008b4c_16e9_4cff_a2e5_06e0a6936cd0.slice/crio-587e67ad90a66b1d493a5ca3cd8e8eddc18b50697548124b66650655a2b94482 WatchSource:0}: Error finding container 587e67ad90a66b1d493a5ca3cd8e8eddc18b50697548124b66650655a2b94482: Status 404 returned error can't find the container with id 587e67ad90a66b1d493a5ca3cd8e8eddc18b50697548124b66650655a2b94482 Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.055597 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:08:39 crc kubenswrapper[4749]: W0310 16:08:39.078564 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3a9f91e_926e_4593_becb_69b5a9e7963a.slice/crio-14cb98f9db202adba89ff75180f3a4aed80081310baab594c2f697549d00a4b1 WatchSource:0}: Error finding container 14cb98f9db202adba89ff75180f3a4aed80081310baab594c2f697549d00a4b1: Status 404 returned error can't find the container with id 14cb98f9db202adba89ff75180f3a4aed80081310baab594c2f697549d00a4b1 Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.331601 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:08:39 crc kubenswrapper[4749]: W0310 16:08:39.359695 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10d67052_4827_4388_b0c8_c737b9837674.slice/crio-cfd0e8ea7bea8eadb52cac1dde34bdccdafc7dd5b060414e414b6b678f09dcc1 WatchSource:0}: Error finding container cfd0e8ea7bea8eadb52cac1dde34bdccdafc7dd5b060414e414b6b678f09dcc1: Status 404 returned error can't find the container with id cfd0e8ea7bea8eadb52cac1dde34bdccdafc7dd5b060414e414b6b678f09dcc1 Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.397391 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"10d67052-4827-4388-b0c8-c737b9837674","Type":"ContainerStarted","Data":"cfd0e8ea7bea8eadb52cac1dde34bdccdafc7dd5b060414e414b6b678f09dcc1"} Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.402098 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6ab4479-94cf-4399-b8ef-0c4f688ec4ad" containerID="08ba65330fd892e7aa456bbe9ba32a410e9e9957926b3e19e51114d2c6a5318c" exitCode=0 Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.402217 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg" event={"ID":"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad","Type":"ContainerDied","Data":"08ba65330fd892e7aa456bbe9ba32a410e9e9957926b3e19e51114d2c6a5318c"} Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.410476 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gwdl7" event={"ID":"8f6aaf20-62e0-47eb-b54d-6edbdf95e770","Type":"ContainerStarted","Data":"31953bdc70f25cf2f794f2c49f79104ada253ebaa98d6e165f75d83ee398f90d"} Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.420590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fea9311-0e47-4352-8d4f-ac90db816fc1","Type":"ContainerStarted","Data":"3960ea3070dcbafeb98063c9c59c53d1c83b53838e9d56884ff6de370ae06aaa"} Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.429216 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3a9f91e-926e-4593-becb-69b5a9e7963a","Type":"ContainerStarted","Data":"14cb98f9db202adba89ff75180f3a4aed80081310baab594c2f697549d00a4b1"} Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.459247 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q4tb7" event={"ID":"7a5d1831-eae7-4ede-a37d-158ef6140d54","Type":"ContainerStarted","Data":"697164b773efd20125629e5f2071a6d2299e396b5c4faf38e683d9df3cdd7620"} Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 
16:08:39.477904 4749 generic.go:334] "Generic (PLEG): container finished" podID="3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" containerID="1da76019397177047d8a7f49c25186d89be470dc084537b7618a4374933a58fe" exitCode=0 Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.478033 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" event={"ID":"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0","Type":"ContainerDied","Data":"1da76019397177047d8a7f49c25186d89be470dc084537b7618a4374933a58fe"} Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.478089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" event={"ID":"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0","Type":"ContainerStarted","Data":"587e67ad90a66b1d493a5ca3cd8e8eddc18b50697548124b66650655a2b94482"} Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.484041 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-q4tb7" podStartSLOduration=3.484018632 podStartE2EDuration="3.484018632s" podCreationTimestamp="2026-03-10 16:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:08:39.477842251 +0000 UTC m=+1216.599707938" watchObservedRunningTime="2026-03-10 16:08:39.484018632 +0000 UTC m=+1216.605884319" Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.496722 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vl2v7" event={"ID":"4cb0aebf-48f9-49b9-a669-141c187f6393","Type":"ContainerStarted","Data":"3ffe7e3606b9a1d0ae05453dacf818a20fdcf617da5f2909f4b18887e63f4394"} Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.522007 4749 generic.go:334] "Generic (PLEG): container finished" podID="25ea755f-102d-437b-a255-9ef2589f9895" containerID="22b60c1c9090bcab82bac832367a391ca8a8c149a1687f4c90c8c43bba5c258e" exitCode=0 Mar 10 16:08:39 crc 
kubenswrapper[4749]: I0310 16:08:39.522095 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r" event={"ID":"25ea755f-102d-437b-a255-9ef2589f9895","Type":"ContainerDied","Data":"22b60c1c9090bcab82bac832367a391ca8a8c149a1687f4c90c8c43bba5c258e"} Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.522123 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r" event={"ID":"25ea755f-102d-437b-a255-9ef2589f9895","Type":"ContainerStarted","Data":"62c3b4f04db21064589b61df249c11d8a0c80ce85f24ab672bd7629ed2c6c5d0"} Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.548722 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tbtx4" event={"ID":"5d4d52fc-2652-4251-afea-b3d1e39ed0f3","Type":"ContainerStarted","Data":"badb3cbf5eba73ad3dc9700c085e285054828a3ced8b6a032338ddd38ad6d4c3"} Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.573135 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vl2v7" podStartSLOduration=3.573110177 podStartE2EDuration="3.573110177s" podCreationTimestamp="2026-03-10 16:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:08:39.537246775 +0000 UTC m=+1216.659112472" watchObservedRunningTime="2026-03-10 16:08:39.573110177 +0000 UTC m=+1216.694975864" Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.655492 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.773397 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:08:39 crc kubenswrapper[4749]: I0310 16:08:39.885261 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:08:40 crc kubenswrapper[4749]: 
I0310 16:08:40.270875 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.391187 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.397856 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-config\") pod \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.398175 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwwmz\" (UniqueName: \"kubernetes.io/projected/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-kube-api-access-hwwmz\") pod \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.398240 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-dns-swift-storage-0\") pod \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.398609 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-ovsdbserver-sb\") pod \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.399068 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-ovsdbserver-nb\") pod \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.399116 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-dns-svc\") pod \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\" (UID: \"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad\") " Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.408229 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-kube-api-access-hwwmz" (OuterVolumeSpecName: "kube-api-access-hwwmz") pod "f6ab4479-94cf-4399-b8ef-0c4f688ec4ad" (UID: "f6ab4479-94cf-4399-b8ef-0c4f688ec4ad"). InnerVolumeSpecName "kube-api-access-hwwmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.457146 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f6ab4479-94cf-4399-b8ef-0c4f688ec4ad" (UID: "f6ab4479-94cf-4399-b8ef-0c4f688ec4ad"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.464656 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f6ab4479-94cf-4399-b8ef-0c4f688ec4ad" (UID: "f6ab4479-94cf-4399-b8ef-0c4f688ec4ad"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.469948 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f6ab4479-94cf-4399-b8ef-0c4f688ec4ad" (UID: "f6ab4479-94cf-4399-b8ef-0c4f688ec4ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.501907 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-ovsdbserver-sb\") pod \"25ea755f-102d-437b-a255-9ef2589f9895\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.501955 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-dns-svc\") pod \"25ea755f-102d-437b-a255-9ef2589f9895\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.502001 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-ovsdbserver-nb\") pod \"25ea755f-102d-437b-a255-9ef2589f9895\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.502083 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-dns-swift-storage-0\") pod \"25ea755f-102d-437b-a255-9ef2589f9895\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.502132 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-config\") pod \"25ea755f-102d-437b-a255-9ef2589f9895\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.502153 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgmsp\" (UniqueName: \"kubernetes.io/projected/25ea755f-102d-437b-a255-9ef2589f9895-kube-api-access-mgmsp\") pod \"25ea755f-102d-437b-a255-9ef2589f9895\" (UID: \"25ea755f-102d-437b-a255-9ef2589f9895\") " Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.502776 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.502794 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.502803 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwwmz\" (UniqueName: \"kubernetes.io/projected/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-kube-api-access-hwwmz\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.502813 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.505181 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f6ab4479-94cf-4399-b8ef-0c4f688ec4ad" 
(UID: "f6ab4479-94cf-4399-b8ef-0c4f688ec4ad"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.505355 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-config" (OuterVolumeSpecName: "config") pod "f6ab4479-94cf-4399-b8ef-0c4f688ec4ad" (UID: "f6ab4479-94cf-4399-b8ef-0c4f688ec4ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.535175 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25ea755f-102d-437b-a255-9ef2589f9895-kube-api-access-mgmsp" (OuterVolumeSpecName: "kube-api-access-mgmsp") pod "25ea755f-102d-437b-a255-9ef2589f9895" (UID: "25ea755f-102d-437b-a255-9ef2589f9895"). InnerVolumeSpecName "kube-api-access-mgmsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.535856 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25ea755f-102d-437b-a255-9ef2589f9895" (UID: "25ea755f-102d-437b-a255-9ef2589f9895"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.590944 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25ea755f-102d-437b-a255-9ef2589f9895" (UID: "25ea755f-102d-437b-a255-9ef2589f9895"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.596369 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "25ea755f-102d-437b-a255-9ef2589f9895" (UID: "25ea755f-102d-437b-a255-9ef2589f9895"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.605977 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.606017 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.606026 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.606036 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.606045 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgmsp\" (UniqueName: \"kubernetes.io/projected/25ea755f-102d-437b-a255-9ef2589f9895-kube-api-access-mgmsp\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.606053 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.649134 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-config" (OuterVolumeSpecName: "config") pod "25ea755f-102d-437b-a255-9ef2589f9895" (UID: "25ea755f-102d-437b-a255-9ef2589f9895"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.661839 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25ea755f-102d-437b-a255-9ef2589f9895" (UID: "25ea755f-102d-437b-a255-9ef2589f9895"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.680126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3a9f91e-926e-4593-becb-69b5a9e7963a","Type":"ContainerStarted","Data":"f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0"} Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.683511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" event={"ID":"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0","Type":"ContainerStarted","Data":"c950f119d2732837fc7212b09951b970610cc472675fbebbfa2aa0c72cf010ef"} Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.684851 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.687706 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg" 
event={"ID":"f6ab4479-94cf-4399-b8ef-0c4f688ec4ad","Type":"ContainerDied","Data":"c2527ee3ac8a9d991bb32e2e23b7ada73b28418aee8123ba372ad6bcb32099e3"} Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.688170 4749 scope.go:117] "RemoveContainer" containerID="08ba65330fd892e7aa456bbe9ba32a410e9e9957926b3e19e51114d2c6a5318c" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.688314 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679ccc59c7-m8lbg" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.695503 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.695790 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6d6f68c7-r4j4r" event={"ID":"25ea755f-102d-437b-a255-9ef2589f9895","Type":"ContainerDied","Data":"62c3b4f04db21064589b61df249c11d8a0c80ce85f24ab672bd7629ed2c6c5d0"} Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.710395 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" podStartSLOduration=3.710334648 podStartE2EDuration="3.710334648s" podCreationTimestamp="2026-03-10 16:08:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:08:40.707440518 +0000 UTC m=+1217.829306215" watchObservedRunningTime="2026-03-10 16:08:40.710334648 +0000 UTC m=+1217.832200345" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.711817 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.711982 4749 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/25ea755f-102d-437b-a255-9ef2589f9895-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.791719 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-679ccc59c7-m8lbg"] Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.811205 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-679ccc59c7-m8lbg"] Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.847463 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d6d6f68c7-r4j4r"] Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.857829 4749 scope.go:117] "RemoveContainer" containerID="22b60c1c9090bcab82bac832367a391ca8a8c149a1687f4c90c8c43bba5c258e" Mar 10 16:08:40 crc kubenswrapper[4749]: I0310 16:08:40.861584 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d6d6f68c7-r4j4r"] Mar 10 16:08:41 crc kubenswrapper[4749]: I0310 16:08:41.617994 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25ea755f-102d-437b-a255-9ef2589f9895" path="/var/lib/kubelet/pods/25ea755f-102d-437b-a255-9ef2589f9895/volumes" Mar 10 16:08:41 crc kubenswrapper[4749]: I0310 16:08:41.619174 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6ab4479-94cf-4399-b8ef-0c4f688ec4ad" path="/var/lib/kubelet/pods/f6ab4479-94cf-4399-b8ef-0c4f688ec4ad/volumes" Mar 10 16:08:41 crc kubenswrapper[4749]: I0310 16:08:41.735460 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3a9f91e-926e-4593-becb-69b5a9e7963a","Type":"ContainerStarted","Data":"c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e"} Mar 10 16:08:41 crc kubenswrapper[4749]: I0310 16:08:41.735642 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d3a9f91e-926e-4593-becb-69b5a9e7963a" 
containerName="glance-log" containerID="cri-o://f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0" gracePeriod=30 Mar 10 16:08:41 crc kubenswrapper[4749]: I0310 16:08:41.735761 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d3a9f91e-926e-4593-becb-69b5a9e7963a" containerName="glance-httpd" containerID="cri-o://c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e" gracePeriod=30 Mar 10 16:08:41 crc kubenswrapper[4749]: I0310 16:08:41.741072 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10d67052-4827-4388-b0c8-c737b9837674","Type":"ContainerStarted","Data":"a9a935e81812daa783edbc5172922a3da877308376e1df4704b3a278a82a4a89"} Mar 10 16:08:41 crc kubenswrapper[4749]: I0310 16:08:41.770666 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.770645783 podStartE2EDuration="5.770645783s" podCreationTimestamp="2026-03-10 16:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:08:41.757974602 +0000 UTC m=+1218.879840279" watchObservedRunningTime="2026-03-10 16:08:41.770645783 +0000 UTC m=+1218.892511470" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.492630 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.560364 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j84tj\" (UniqueName: \"kubernetes.io/projected/d3a9f91e-926e-4593-becb-69b5a9e7963a-kube-api-access-j84tj\") pod \"d3a9f91e-926e-4593-becb-69b5a9e7963a\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.560472 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"d3a9f91e-926e-4593-becb-69b5a9e7963a\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.560557 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3a9f91e-926e-4593-becb-69b5a9e7963a-httpd-run\") pod \"d3a9f91e-926e-4593-becb-69b5a9e7963a\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.560589 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-scripts\") pod \"d3a9f91e-926e-4593-becb-69b5a9e7963a\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.560615 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-combined-ca-bundle\") pod \"d3a9f91e-926e-4593-becb-69b5a9e7963a\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.560677 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d3a9f91e-926e-4593-becb-69b5a9e7963a-logs\") pod \"d3a9f91e-926e-4593-becb-69b5a9e7963a\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.560775 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-config-data\") pod \"d3a9f91e-926e-4593-becb-69b5a9e7963a\" (UID: \"d3a9f91e-926e-4593-becb-69b5a9e7963a\") " Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.561131 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3a9f91e-926e-4593-becb-69b5a9e7963a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d3a9f91e-926e-4593-becb-69b5a9e7963a" (UID: "d3a9f91e-926e-4593-becb-69b5a9e7963a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.561492 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3a9f91e-926e-4593-becb-69b5a9e7963a-logs" (OuterVolumeSpecName: "logs") pod "d3a9f91e-926e-4593-becb-69b5a9e7963a" (UID: "d3a9f91e-926e-4593-becb-69b5a9e7963a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.562155 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3a9f91e-926e-4593-becb-69b5a9e7963a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.562178 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a9f91e-926e-4593-becb-69b5a9e7963a-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.567651 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "d3a9f91e-926e-4593-becb-69b5a9e7963a" (UID: "d3a9f91e-926e-4593-becb-69b5a9e7963a"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.569082 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a9f91e-926e-4593-becb-69b5a9e7963a-kube-api-access-j84tj" (OuterVolumeSpecName: "kube-api-access-j84tj") pod "d3a9f91e-926e-4593-becb-69b5a9e7963a" (UID: "d3a9f91e-926e-4593-becb-69b5a9e7963a"). InnerVolumeSpecName "kube-api-access-j84tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.572408 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-scripts" (OuterVolumeSpecName: "scripts") pod "d3a9f91e-926e-4593-becb-69b5a9e7963a" (UID: "d3a9f91e-926e-4593-becb-69b5a9e7963a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.594713 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3a9f91e-926e-4593-becb-69b5a9e7963a" (UID: "d3a9f91e-926e-4593-becb-69b5a9e7963a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.636784 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-config-data" (OuterVolumeSpecName: "config-data") pod "d3a9f91e-926e-4593-becb-69b5a9e7963a" (UID: "d3a9f91e-926e-4593-becb-69b5a9e7963a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.666062 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j84tj\" (UniqueName: \"kubernetes.io/projected/d3a9f91e-926e-4593-becb-69b5a9e7963a-kube-api-access-j84tj\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.666102 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.666113 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.666123 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 
16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.666132 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a9f91e-926e-4593-becb-69b5a9e7963a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.692838 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.770325 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.778891 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10d67052-4827-4388-b0c8-c737b9837674","Type":"ContainerStarted","Data":"9913bc91c37293526debbd62a2f37ebb61b588d94e4136728386b93834155f31"} Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.779057 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="10d67052-4827-4388-b0c8-c737b9837674" containerName="glance-log" containerID="cri-o://a9a935e81812daa783edbc5172922a3da877308376e1df4704b3a278a82a4a89" gracePeriod=30 Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.779662 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="10d67052-4827-4388-b0c8-c737b9837674" containerName="glance-httpd" containerID="cri-o://9913bc91c37293526debbd62a2f37ebb61b588d94e4136728386b93834155f31" gracePeriod=30 Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.814167 4749 generic.go:334] "Generic (PLEG): container finished" podID="d3a9f91e-926e-4593-becb-69b5a9e7963a" 
containerID="c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e" exitCode=143 Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.814210 4749 generic.go:334] "Generic (PLEG): container finished" podID="d3a9f91e-926e-4593-becb-69b5a9e7963a" containerID="f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0" exitCode=143 Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.814249 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.814399 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3a9f91e-926e-4593-becb-69b5a9e7963a","Type":"ContainerDied","Data":"c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e"} Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.814441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3a9f91e-926e-4593-becb-69b5a9e7963a","Type":"ContainerDied","Data":"f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0"} Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.814459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3a9f91e-926e-4593-becb-69b5a9e7963a","Type":"ContainerDied","Data":"14cb98f9db202adba89ff75180f3a4aed80081310baab594c2f697549d00a4b1"} Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.814482 4749 scope.go:117] "RemoveContainer" containerID="c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.833333 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.833308092 podStartE2EDuration="5.833308092s" podCreationTimestamp="2026-03-10 16:08:37 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:08:42.823115921 +0000 UTC m=+1219.944981608" watchObservedRunningTime="2026-03-10 16:08:42.833308092 +0000 UTC m=+1219.955173779" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.879184 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.881048 4749 scope.go:117] "RemoveContainer" containerID="f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.895304 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.922189 4749 scope.go:117] "RemoveContainer" containerID="c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e" Mar 10 16:08:42 crc kubenswrapper[4749]: E0310 16:08:42.923334 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e\": container with ID starting with c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e not found: ID does not exist" containerID="c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.923389 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e"} err="failed to get container status \"c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e\": rpc error: code = NotFound desc = could not find container \"c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e\": container with ID starting with c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e not found: ID does not exist" Mar 10 16:08:42 
crc kubenswrapper[4749]: I0310 16:08:42.923422 4749 scope.go:117] "RemoveContainer" containerID="f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0" Mar 10 16:08:42 crc kubenswrapper[4749]: E0310 16:08:42.923780 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0\": container with ID starting with f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0 not found: ID does not exist" containerID="f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.923799 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0"} err="failed to get container status \"f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0\": rpc error: code = NotFound desc = could not find container \"f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0\": container with ID starting with f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0 not found: ID does not exist" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.923813 4749 scope.go:117] "RemoveContainer" containerID="c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.924099 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e"} err="failed to get container status \"c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e\": rpc error: code = NotFound desc = could not find container \"c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e\": container with ID starting with c00254e73576603a1b9e1e7465fd8eee4293ba99a3ae1f3da57cdc9c860de57e not found: ID does not exist" Mar 10 
16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.924115 4749 scope.go:117] "RemoveContainer" containerID="f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.924535 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0"} err="failed to get container status \"f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0\": rpc error: code = NotFound desc = could not find container \"f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0\": container with ID starting with f557a48da8d08f40c28a81cc0723f12d017a0a85f1438b50bc7e3c65341360e0 not found: ID does not exist" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.930476 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:08:42 crc kubenswrapper[4749]: E0310 16:08:42.930831 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6ab4479-94cf-4399-b8ef-0c4f688ec4ad" containerName="init" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.930844 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ab4479-94cf-4399-b8ef-0c4f688ec4ad" containerName="init" Mar 10 16:08:42 crc kubenswrapper[4749]: E0310 16:08:42.930858 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a9f91e-926e-4593-becb-69b5a9e7963a" containerName="glance-httpd" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.930864 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a9f91e-926e-4593-becb-69b5a9e7963a" containerName="glance-httpd" Mar 10 16:08:42 crc kubenswrapper[4749]: E0310 16:08:42.930887 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ea755f-102d-437b-a255-9ef2589f9895" containerName="init" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.930896 4749 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="25ea755f-102d-437b-a255-9ef2589f9895" containerName="init" Mar 10 16:08:42 crc kubenswrapper[4749]: E0310 16:08:42.930920 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a9f91e-926e-4593-becb-69b5a9e7963a" containerName="glance-log" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.930928 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a9f91e-926e-4593-becb-69b5a9e7963a" containerName="glance-log" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.931087 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ea755f-102d-437b-a255-9ef2589f9895" containerName="init" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.931102 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3a9f91e-926e-4593-becb-69b5a9e7963a" containerName="glance-httpd" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.931112 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6ab4479-94cf-4399-b8ef-0c4f688ec4ad" containerName="init" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.931125 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3a9f91e-926e-4593-becb-69b5a9e7963a" containerName="glance-log" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.931998 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.936209 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.943417 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.976776 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.976853 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.976887 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-logs\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.976905 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-config-data\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " 
pod="openstack/glance-default-external-api-0" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.976935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6xmd\" (UniqueName: \"kubernetes.io/projected/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-kube-api-access-j6xmd\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.976981 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-scripts\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:42 crc kubenswrapper[4749]: I0310 16:08:42.977020 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.079631 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.079695 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " 
pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.079723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-logs\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.079743 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-config-data\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.079781 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6xmd\" (UniqueName: \"kubernetes.io/projected/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-kube-api-access-j6xmd\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.079832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-scripts\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.079868 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: 
I0310 16:08:43.080297 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.080847 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.084154 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-logs\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.091516 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.108002 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-scripts\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.108522 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-config-data\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.131607 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6xmd\" (UniqueName: \"kubernetes.io/projected/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-kube-api-access-j6xmd\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.203199 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.262357 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.629959 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3a9f91e-926e-4593-becb-69b5a9e7963a" path="/var/lib/kubelet/pods/d3a9f91e-926e-4593-becb-69b5a9e7963a/volumes" Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.829511 4749 generic.go:334] "Generic (PLEG): container finished" podID="10d67052-4827-4388-b0c8-c737b9837674" containerID="9913bc91c37293526debbd62a2f37ebb61b588d94e4136728386b93834155f31" exitCode=0 Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.829647 4749 generic.go:334] "Generic (PLEG): container finished" podID="10d67052-4827-4388-b0c8-c737b9837674" containerID="a9a935e81812daa783edbc5172922a3da877308376e1df4704b3a278a82a4a89" exitCode=143 Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.829671 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10d67052-4827-4388-b0c8-c737b9837674","Type":"ContainerDied","Data":"9913bc91c37293526debbd62a2f37ebb61b588d94e4136728386b93834155f31"} Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.829707 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10d67052-4827-4388-b0c8-c737b9837674","Type":"ContainerDied","Data":"a9a935e81812daa783edbc5172922a3da877308376e1df4704b3a278a82a4a89"} Mar 10 16:08:43 crc kubenswrapper[4749]: I0310 16:08:43.923972 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:08:44 crc kubenswrapper[4749]: I0310 16:08:44.839960 4749 generic.go:334] "Generic (PLEG): container finished" podID="4cb0aebf-48f9-49b9-a669-141c187f6393" containerID="3ffe7e3606b9a1d0ae05453dacf818a20fdcf617da5f2909f4b18887e63f4394" exitCode=0 Mar 10 16:08:44 crc kubenswrapper[4749]: I0310 16:08:44.840160 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-vl2v7" event={"ID":"4cb0aebf-48f9-49b9-a669-141c187f6393","Type":"ContainerDied","Data":"3ffe7e3606b9a1d0ae05453dacf818a20fdcf617da5f2909f4b18887e63f4394"} Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.749322 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.773633 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-combined-ca-bundle\") pod \"10d67052-4827-4388-b0c8-c737b9837674\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.774080 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10d67052-4827-4388-b0c8-c737b9837674-logs\") pod \"10d67052-4827-4388-b0c8-c737b9837674\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.774161 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpbmd\" (UniqueName: \"kubernetes.io/projected/10d67052-4827-4388-b0c8-c737b9837674-kube-api-access-fpbmd\") pod \"10d67052-4827-4388-b0c8-c737b9837674\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.774195 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-config-data\") pod \"10d67052-4827-4388-b0c8-c737b9837674\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.774305 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/10d67052-4827-4388-b0c8-c737b9837674-httpd-run\") pod \"10d67052-4827-4388-b0c8-c737b9837674\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.774331 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"10d67052-4827-4388-b0c8-c737b9837674\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.774362 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-scripts\") pod \"10d67052-4827-4388-b0c8-c737b9837674\" (UID: \"10d67052-4827-4388-b0c8-c737b9837674\") " Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.777013 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10d67052-4827-4388-b0c8-c737b9837674-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "10d67052-4827-4388-b0c8-c737b9837674" (UID: "10d67052-4827-4388-b0c8-c737b9837674"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.777025 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10d67052-4827-4388-b0c8-c737b9837674-logs" (OuterVolumeSpecName: "logs") pod "10d67052-4827-4388-b0c8-c737b9837674" (UID: "10d67052-4827-4388-b0c8-c737b9837674"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.784413 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "10d67052-4827-4388-b0c8-c737b9837674" (UID: "10d67052-4827-4388-b0c8-c737b9837674"). 
InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.784510 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d67052-4827-4388-b0c8-c737b9837674-kube-api-access-fpbmd" (OuterVolumeSpecName: "kube-api-access-fpbmd") pod "10d67052-4827-4388-b0c8-c737b9837674" (UID: "10d67052-4827-4388-b0c8-c737b9837674"). InnerVolumeSpecName "kube-api-access-fpbmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.800715 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-scripts" (OuterVolumeSpecName: "scripts") pod "10d67052-4827-4388-b0c8-c737b9837674" (UID: "10d67052-4827-4388-b0c8-c737b9837674"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.810637 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10d67052-4827-4388-b0c8-c737b9837674" (UID: "10d67052-4827-4388-b0c8-c737b9837674"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.833899 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-config-data" (OuterVolumeSpecName: "config-data") pod "10d67052-4827-4388-b0c8-c737b9837674" (UID: "10d67052-4827-4388-b0c8-c737b9837674"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.856136 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.857978 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10d67052-4827-4388-b0c8-c737b9837674","Type":"ContainerDied","Data":"cfd0e8ea7bea8eadb52cac1dde34bdccdafc7dd5b060414e414b6b678f09dcc1"} Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.858037 4749 scope.go:117] "RemoveContainer" containerID="9913bc91c37293526debbd62a2f37ebb61b588d94e4136728386b93834155f31" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.877039 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10d67052-4827-4388-b0c8-c737b9837674-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.877090 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.877102 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.877110 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.877121 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10d67052-4827-4388-b0c8-c737b9837674-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.877129 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpbmd\" 
(UniqueName: \"kubernetes.io/projected/10d67052-4827-4388-b0c8-c737b9837674-kube-api-access-fpbmd\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.877139 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d67052-4827-4388-b0c8-c737b9837674-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.898438 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.917192 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.931178 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.942548 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:08:45 crc kubenswrapper[4749]: E0310 16:08:45.942952 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d67052-4827-4388-b0c8-c737b9837674" containerName="glance-log" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.942964 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d67052-4827-4388-b0c8-c737b9837674" containerName="glance-log" Mar 10 16:08:45 crc kubenswrapper[4749]: E0310 16:08:45.942978 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d67052-4827-4388-b0c8-c737b9837674" containerName="glance-httpd" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.942984 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d67052-4827-4388-b0c8-c737b9837674" containerName="glance-httpd" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.943131 4749 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="10d67052-4827-4388-b0c8-c737b9837674" containerName="glance-httpd" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.943145 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d67052-4827-4388-b0c8-c737b9837674" containerName="glance-log" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.943966 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.963453 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.972822 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:08:45 crc kubenswrapper[4749]: I0310 16:08:45.985772 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.099714 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.099797 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrv9k\" (UniqueName: \"kubernetes.io/projected/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-kube-api-access-zrv9k\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.099840 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.099938 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.099963 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.100007 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.100036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.203093 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.203180 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrv9k\" (UniqueName: \"kubernetes.io/projected/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-kube-api-access-zrv9k\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.203241 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.203338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.203354 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.203436 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.203460 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.203942 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.203975 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.204409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.210674 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.211115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.219612 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.227603 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrv9k\" (UniqueName: \"kubernetes.io/projected/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-kube-api-access-zrv9k\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.232073 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:08:46 crc kubenswrapper[4749]: I0310 16:08:46.311155 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:08:47 crc kubenswrapper[4749]: I0310 16:08:47.634827 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d67052-4827-4388-b0c8-c737b9837674" path="/var/lib/kubelet/pods/10d67052-4827-4388-b0c8-c737b9837674/volumes" Mar 10 16:08:47 crc kubenswrapper[4749]: I0310 16:08:47.674935 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:08:47 crc kubenswrapper[4749]: I0310 16:08:47.744375 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:08:48 crc kubenswrapper[4749]: I0310 16:08:48.043953 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:08:48 crc kubenswrapper[4749]: I0310 16:08:48.113111 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f8d8dcc-w4nhb"] Mar 10 16:08:48 crc kubenswrapper[4749]: I0310 16:08:48.113635 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" podUID="3446bf6c-8fd9-491e-aee2-87c44e34c315" containerName="dnsmasq-dns" containerID="cri-o://255fe3e862784ea5711146921e64697383c4d78a74be22864fdd88cee5a1010a" gracePeriod=10 Mar 10 16:08:48 crc kubenswrapper[4749]: I0310 16:08:48.879803 4749 generic.go:334] "Generic (PLEG): container finished" podID="3446bf6c-8fd9-491e-aee2-87c44e34c315" containerID="255fe3e862784ea5711146921e64697383c4d78a74be22864fdd88cee5a1010a" exitCode=0 Mar 10 16:08:48 crc kubenswrapper[4749]: I0310 16:08:48.879848 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" event={"ID":"3446bf6c-8fd9-491e-aee2-87c44e34c315","Type":"ContainerDied","Data":"255fe3e862784ea5711146921e64697383c4d78a74be22864fdd88cee5a1010a"} Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.364503 4749 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" podUID="3446bf6c-8fd9-491e-aee2-87c44e34c315" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Mar 10 16:08:51 crc kubenswrapper[4749]: W0310 16:08:51.604345 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89227fcb_0cbf_45c7_8b6c_9f98ec2210e1.slice/crio-141ecface0fb479bc4c6e5bd5bd48d2e2f56ee1a4c857ce6d4ee89348b1f6364 WatchSource:0}: Error finding container 141ecface0fb479bc4c6e5bd5bd48d2e2f56ee1a4c857ce6d4ee89348b1f6364: Status 404 returned error can't find the container with id 141ecface0fb479bc4c6e5bd5bd48d2e2f56ee1a4c857ce6d4ee89348b1f6364 Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.743208 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vl2v7" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.822447 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-fernet-keys\") pod \"4cb0aebf-48f9-49b9-a669-141c187f6393\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.822502 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-combined-ca-bundle\") pod \"4cb0aebf-48f9-49b9-a669-141c187f6393\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.822538 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-scripts\") pod \"4cb0aebf-48f9-49b9-a669-141c187f6393\" (UID: 
\"4cb0aebf-48f9-49b9-a669-141c187f6393\") " Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.822599 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-credential-keys\") pod \"4cb0aebf-48f9-49b9-a669-141c187f6393\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.822642 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-config-data\") pod \"4cb0aebf-48f9-49b9-a669-141c187f6393\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.822674 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prk4n\" (UniqueName: \"kubernetes.io/projected/4cb0aebf-48f9-49b9-a669-141c187f6393-kube-api-access-prk4n\") pod \"4cb0aebf-48f9-49b9-a669-141c187f6393\" (UID: \"4cb0aebf-48f9-49b9-a669-141c187f6393\") " Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.830360 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4cb0aebf-48f9-49b9-a669-141c187f6393" (UID: "4cb0aebf-48f9-49b9-a669-141c187f6393"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.830520 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb0aebf-48f9-49b9-a669-141c187f6393-kube-api-access-prk4n" (OuterVolumeSpecName: "kube-api-access-prk4n") pod "4cb0aebf-48f9-49b9-a669-141c187f6393" (UID: "4cb0aebf-48f9-49b9-a669-141c187f6393"). InnerVolumeSpecName "kube-api-access-prk4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.830969 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-scripts" (OuterVolumeSpecName: "scripts") pod "4cb0aebf-48f9-49b9-a669-141c187f6393" (UID: "4cb0aebf-48f9-49b9-a669-141c187f6393"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.833586 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4cb0aebf-48f9-49b9-a669-141c187f6393" (UID: "4cb0aebf-48f9-49b9-a669-141c187f6393"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.856548 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cb0aebf-48f9-49b9-a669-141c187f6393" (UID: "4cb0aebf-48f9-49b9-a669-141c187f6393"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.865090 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-config-data" (OuterVolumeSpecName: "config-data") pod "4cb0aebf-48f9-49b9-a669-141c187f6393" (UID: "4cb0aebf-48f9-49b9-a669-141c187f6393"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.926015 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.926037 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.926050 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.926060 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.926068 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb0aebf-48f9-49b9-a669-141c187f6393-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.926076 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prk4n\" (UniqueName: \"kubernetes.io/projected/4cb0aebf-48f9-49b9-a669-141c187f6393-kube-api-access-prk4n\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.931135 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vl2v7" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.932017 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vl2v7" event={"ID":"4cb0aebf-48f9-49b9-a669-141c187f6393","Type":"ContainerDied","Data":"896da32e1cd189e3abc2287336a96f81ca5e4f2af6656f41328ba7756dbdad8d"} Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.932127 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="896da32e1cd189e3abc2287336a96f81ca5e4f2af6656f41328ba7756dbdad8d" Mar 10 16:08:51 crc kubenswrapper[4749]: I0310 16:08:51.934549 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1","Type":"ContainerStarted","Data":"141ecface0fb479bc4c6e5bd5bd48d2e2f56ee1a4c857ce6d4ee89348b1f6364"} Mar 10 16:08:52 crc kubenswrapper[4749]: I0310 16:08:52.838138 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vl2v7"] Mar 10 16:08:52 crc kubenswrapper[4749]: I0310 16:08:52.845761 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vl2v7"] Mar 10 16:08:52 crc kubenswrapper[4749]: I0310 16:08:52.962079 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vxg2v"] Mar 10 16:08:52 crc kubenswrapper[4749]: E0310 16:08:52.962884 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb0aebf-48f9-49b9-a669-141c187f6393" containerName="keystone-bootstrap" Mar 10 16:08:52 crc kubenswrapper[4749]: I0310 16:08:52.962929 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb0aebf-48f9-49b9-a669-141c187f6393" containerName="keystone-bootstrap" Mar 10 16:08:52 crc kubenswrapper[4749]: I0310 16:08:52.963336 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb0aebf-48f9-49b9-a669-141c187f6393" containerName="keystone-bootstrap" Mar 
10 16:08:52 crc kubenswrapper[4749]: I0310 16:08:52.964560 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:52 crc kubenswrapper[4749]: I0310 16:08:52.966865 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 16:08:52 crc kubenswrapper[4749]: I0310 16:08:52.966877 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 16:08:52 crc kubenswrapper[4749]: I0310 16:08:52.967270 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ppc5j" Mar 10 16:08:52 crc kubenswrapper[4749]: I0310 16:08:52.967472 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 16:08:52 crc kubenswrapper[4749]: I0310 16:08:52.968103 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 16:08:52 crc kubenswrapper[4749]: I0310 16:08:52.972693 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vxg2v"] Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.055189 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-credential-keys\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.055301 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bkzc\" (UniqueName: \"kubernetes.io/projected/5dbeb14b-95ab-438e-b3e5-be66a6c34188-kube-api-access-7bkzc\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc 
kubenswrapper[4749]: I0310 16:08:53.055471 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-fernet-keys\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.055515 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-scripts\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.055556 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-config-data\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.055609 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-combined-ca-bundle\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.157246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-fernet-keys\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.157323 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-scripts\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.157367 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-config-data\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.157509 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-combined-ca-bundle\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.157617 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-credential-keys\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.157662 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bkzc\" (UniqueName: \"kubernetes.io/projected/5dbeb14b-95ab-438e-b3e5-be66a6c34188-kube-api-access-7bkzc\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.163252 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-fernet-keys\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.164260 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-scripts\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.172985 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-config-data\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.177932 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-credential-keys\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.182053 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-combined-ca-bundle\") pod \"keystone-bootstrap-vxg2v\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.186679 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bkzc\" (UniqueName: \"kubernetes.io/projected/5dbeb14b-95ab-438e-b3e5-be66a6c34188-kube-api-access-7bkzc\") pod \"keystone-bootstrap-vxg2v\" (UID: 
\"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.292426 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:08:53 crc kubenswrapper[4749]: I0310 16:08:53.627258 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cb0aebf-48f9-49b9-a669-141c187f6393" path="/var/lib/kubelet/pods/4cb0aebf-48f9-49b9-a669-141c187f6393/volumes" Mar 10 16:08:54 crc kubenswrapper[4749]: E0310 16:08:54.213046 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d" Mar 10 16:08:54 crc kubenswrapper[4749]: E0310 16:08:54.213657 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mqh5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-gwdl7_openstack(8f6aaf20-62e0-47eb-b54d-6edbdf95e770): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 16:08:54 crc kubenswrapper[4749]: E0310 16:08:54.215164 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-gwdl7" 
podUID="8f6aaf20-62e0-47eb-b54d-6edbdf95e770" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.280064 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.482594 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-dns-swift-storage-0\") pod \"3446bf6c-8fd9-491e-aee2-87c44e34c315\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.482746 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jfmd\" (UniqueName: \"kubernetes.io/projected/3446bf6c-8fd9-491e-aee2-87c44e34c315-kube-api-access-2jfmd\") pod \"3446bf6c-8fd9-491e-aee2-87c44e34c315\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.482780 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-config\") pod \"3446bf6c-8fd9-491e-aee2-87c44e34c315\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.482903 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-dns-svc\") pod \"3446bf6c-8fd9-491e-aee2-87c44e34c315\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.482951 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-ovsdbserver-nb\") pod \"3446bf6c-8fd9-491e-aee2-87c44e34c315\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " 
Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.483012 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-ovsdbserver-sb\") pod \"3446bf6c-8fd9-491e-aee2-87c44e34c315\" (UID: \"3446bf6c-8fd9-491e-aee2-87c44e34c315\") " Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.495976 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3446bf6c-8fd9-491e-aee2-87c44e34c315-kube-api-access-2jfmd" (OuterVolumeSpecName: "kube-api-access-2jfmd") pod "3446bf6c-8fd9-491e-aee2-87c44e34c315" (UID: "3446bf6c-8fd9-491e-aee2-87c44e34c315"). InnerVolumeSpecName "kube-api-access-2jfmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.533574 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3446bf6c-8fd9-491e-aee2-87c44e34c315" (UID: "3446bf6c-8fd9-491e-aee2-87c44e34c315"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.535671 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3446bf6c-8fd9-491e-aee2-87c44e34c315" (UID: "3446bf6c-8fd9-491e-aee2-87c44e34c315"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.536800 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-config" (OuterVolumeSpecName: "config") pod "3446bf6c-8fd9-491e-aee2-87c44e34c315" (UID: "3446bf6c-8fd9-491e-aee2-87c44e34c315"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.539846 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3446bf6c-8fd9-491e-aee2-87c44e34c315" (UID: "3446bf6c-8fd9-491e-aee2-87c44e34c315"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.556229 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3446bf6c-8fd9-491e-aee2-87c44e34c315" (UID: "3446bf6c-8fd9-491e-aee2-87c44e34c315"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.586697 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jfmd\" (UniqueName: \"kubernetes.io/projected/3446bf6c-8fd9-491e-aee2-87c44e34c315-kube-api-access-2jfmd\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.586744 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.586757 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.586768 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.586783 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.586795 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3446bf6c-8fd9-491e-aee2-87c44e34c315-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.976873 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" Mar 10 16:08:54 crc kubenswrapper[4749]: I0310 16:08:54.976930 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f8d8dcc-w4nhb" event={"ID":"3446bf6c-8fd9-491e-aee2-87c44e34c315","Type":"ContainerDied","Data":"8899f1cc41262a45c5fb09e083cd8b618e6dfe8f46274646946d31b192b3f027"} Mar 10 16:08:54 crc kubenswrapper[4749]: E0310 16:08:54.978006 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d\\\"\"" pod="openstack/barbican-db-sync-gwdl7" podUID="8f6aaf20-62e0-47eb-b54d-6edbdf95e770" Mar 10 16:08:55 crc kubenswrapper[4749]: I0310 16:08:55.022783 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f8d8dcc-w4nhb"] Mar 10 16:08:55 crc kubenswrapper[4749]: I0310 16:08:55.039196 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58f8d8dcc-w4nhb"] Mar 10 16:08:55 crc kubenswrapper[4749]: I0310 16:08:55.621518 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3446bf6c-8fd9-491e-aee2-87c44e34c315" path="/var/lib/kubelet/pods/3446bf6c-8fd9-491e-aee2-87c44e34c315/volumes" Mar 10 16:09:01 crc kubenswrapper[4749]: I0310 16:09:01.266055 4749 scope.go:117] "RemoveContainer" containerID="a9a935e81812daa783edbc5172922a3da877308376e1df4704b3a278a82a4a89" Mar 10 16:09:02 crc kubenswrapper[4749]: I0310 16:09:02.043735 4749 generic.go:334] "Generic (PLEG): container finished" podID="7a5d1831-eae7-4ede-a37d-158ef6140d54" containerID="697164b773efd20125629e5f2071a6d2299e396b5c4faf38e683d9df3cdd7620" exitCode=0 Mar 10 16:09:02 crc kubenswrapper[4749]: I0310 16:09:02.043914 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q4tb7" 
event={"ID":"7a5d1831-eae7-4ede-a37d-158ef6140d54","Type":"ContainerDied","Data":"697164b773efd20125629e5f2071a6d2299e396b5c4faf38e683d9df3cdd7620"} Mar 10 16:09:02 crc kubenswrapper[4749]: I0310 16:09:02.368638 4749 scope.go:117] "RemoveContainer" containerID="255fe3e862784ea5711146921e64697383c4d78a74be22864fdd88cee5a1010a" Mar 10 16:09:02 crc kubenswrapper[4749]: E0310 16:09:02.382707 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Mar 10 16:09:02 crc kubenswrapper[4749]: E0310 16:09:02.382910 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8sg2w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-ff4lh_openstack(876272e9-3af8-40ba-aac7-40f8cecc909e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 16:09:02 crc kubenswrapper[4749]: E0310 16:09:02.385667 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-ff4lh" podUID="876272e9-3af8-40ba-aac7-40f8cecc909e" Mar 10 16:09:02 crc kubenswrapper[4749]: I0310 16:09:02.476834 4749 scope.go:117] "RemoveContainer" containerID="abf39b43e1b7826151e86c6bb1f90364c93edd1de0334b34bcca2d5d93db8d3a" Mar 10 16:09:02 crc kubenswrapper[4749]: I0310 16:09:02.853199 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vxg2v"] Mar 10 16:09:02 crc kubenswrapper[4749]: W0310 16:09:02.882491 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dbeb14b_95ab_438e_b3e5_be66a6c34188.slice/crio-d6367073ae8bc849e0cfdc1245440a8158efc19856c85299de3538dff1ae44c1 WatchSource:0}: Error finding container d6367073ae8bc849e0cfdc1245440a8158efc19856c85299de3538dff1ae44c1: Status 404 returned error can't find the container with id d6367073ae8bc849e0cfdc1245440a8158efc19856c85299de3538dff1ae44c1 Mar 10 16:09:02 crc kubenswrapper[4749]: I0310 16:09:02.944849 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.063998 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1","Type":"ContainerStarted","Data":"d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861"} Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.068427 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e","Type":"ContainerStarted","Data":"baa62f6e5323111cc60431c469af76c9897b3d8213a45afbbc2b7a176bdd6a98"} Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.076553 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vxg2v" event={"ID":"5dbeb14b-95ab-438e-b3e5-be66a6c34188","Type":"ContainerStarted","Data":"d6367073ae8bc849e0cfdc1245440a8158efc19856c85299de3538dff1ae44c1"} Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.080515 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fea9311-0e47-4352-8d4f-ac90db816fc1","Type":"ContainerStarted","Data":"ed18f1b882ca0e625ea1d9e295570112cd67ff4379d310dfc0b3721c89d6483d"} Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.084792 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tbtx4" event={"ID":"5d4d52fc-2652-4251-afea-b3d1e39ed0f3","Type":"ContainerStarted","Data":"7d1730370c1d50f08602e88fdd860a08bffb34493d3059343bab04e49ffcd929"} Mar 10 16:09:03 crc kubenswrapper[4749]: E0310 16:09:03.086532 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-ff4lh" podUID="876272e9-3af8-40ba-aac7-40f8cecc909e" Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.111525 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-db-sync-tbtx4" podStartSLOduration=2.380749287 podStartE2EDuration="26.11149896s" podCreationTimestamp="2026-03-10 16:08:37 +0000 UTC" firstStartedPulling="2026-03-10 16:08:38.624334968 +0000 UTC m=+1215.746200655" lastFinishedPulling="2026-03-10 16:09:02.355084641 +0000 UTC m=+1239.476950328" observedRunningTime="2026-03-10 16:09:03.100042884 +0000 UTC m=+1240.221908571" watchObservedRunningTime="2026-03-10 16:09:03.11149896 +0000 UTC m=+1240.233364657" Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.338110 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q4tb7" Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.359792 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5d1831-eae7-4ede-a37d-158ef6140d54-combined-ca-bundle\") pod \"7a5d1831-eae7-4ede-a37d-158ef6140d54\" (UID: \"7a5d1831-eae7-4ede-a37d-158ef6140d54\") " Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.359951 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a5d1831-eae7-4ede-a37d-158ef6140d54-config\") pod \"7a5d1831-eae7-4ede-a37d-158ef6140d54\" (UID: \"7a5d1831-eae7-4ede-a37d-158ef6140d54\") " Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.360042 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hndmt\" (UniqueName: \"kubernetes.io/projected/7a5d1831-eae7-4ede-a37d-158ef6140d54-kube-api-access-hndmt\") pod \"7a5d1831-eae7-4ede-a37d-158ef6140d54\" (UID: \"7a5d1831-eae7-4ede-a37d-158ef6140d54\") " Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.368060 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a5d1831-eae7-4ede-a37d-158ef6140d54-kube-api-access-hndmt" (OuterVolumeSpecName: 
"kube-api-access-hndmt") pod "7a5d1831-eae7-4ede-a37d-158ef6140d54" (UID: "7a5d1831-eae7-4ede-a37d-158ef6140d54"). InnerVolumeSpecName "kube-api-access-hndmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.423185 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a5d1831-eae7-4ede-a37d-158ef6140d54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a5d1831-eae7-4ede-a37d-158ef6140d54" (UID: "7a5d1831-eae7-4ede-a37d-158ef6140d54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.435746 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a5d1831-eae7-4ede-a37d-158ef6140d54-config" (OuterVolumeSpecName: "config") pod "7a5d1831-eae7-4ede-a37d-158ef6140d54" (UID: "7a5d1831-eae7-4ede-a37d-158ef6140d54"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.462658 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a5d1831-eae7-4ede-a37d-158ef6140d54-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.462800 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hndmt\" (UniqueName: \"kubernetes.io/projected/7a5d1831-eae7-4ede-a37d-158ef6140d54-kube-api-access-hndmt\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:03 crc kubenswrapper[4749]: I0310 16:09:03.462877 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5d1831-eae7-4ede-a37d-158ef6140d54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.133772 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1","Type":"ContainerStarted","Data":"8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601"} Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.134538 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" containerName="glance-log" containerID="cri-o://d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861" gracePeriod=30 Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.134756 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" containerName="glance-httpd" containerID="cri-o://8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601" gracePeriod=30 Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.139228 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e","Type":"ContainerStarted","Data":"d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c"} Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.143125 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vxg2v" event={"ID":"5dbeb14b-95ab-438e-b3e5-be66a6c34188","Type":"ContainerStarted","Data":"71646d3d80a269ec395cc73be3f051a86dc38c361242cd8c53640573173be447"} Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.146574 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q4tb7" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.147660 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q4tb7" event={"ID":"7a5d1831-eae7-4ede-a37d-158ef6140d54","Type":"ContainerDied","Data":"70fb6b2dc3ee0202064475af5ec26270d4dcf00cefc7f9bdf2917a79af108fb7"} Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.147718 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70fb6b2dc3ee0202064475af5ec26270d4dcf00cefc7f9bdf2917a79af108fb7" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.159043 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=22.159020532 podStartE2EDuration="22.159020532s" podCreationTimestamp="2026-03-10 16:08:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:04.155809674 +0000 UTC m=+1241.277675371" watchObservedRunningTime="2026-03-10 16:09:04.159020532 +0000 UTC m=+1241.280886219" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.194712 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vxg2v" 
podStartSLOduration=12.194682955 podStartE2EDuration="12.194682955s" podCreationTimestamp="2026-03-10 16:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:04.182418167 +0000 UTC m=+1241.304283844" watchObservedRunningTime="2026-03-10 16:09:04.194682955 +0000 UTC m=+1241.316548642" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.308095 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5864dc4585-tnqz6"] Mar 10 16:09:04 crc kubenswrapper[4749]: E0310 16:09:04.309010 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3446bf6c-8fd9-491e-aee2-87c44e34c315" containerName="init" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.309029 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3446bf6c-8fd9-491e-aee2-87c44e34c315" containerName="init" Mar 10 16:09:04 crc kubenswrapper[4749]: E0310 16:09:04.309054 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3446bf6c-8fd9-491e-aee2-87c44e34c315" containerName="dnsmasq-dns" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.309061 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3446bf6c-8fd9-491e-aee2-87c44e34c315" containerName="dnsmasq-dns" Mar 10 16:09:04 crc kubenswrapper[4749]: E0310 16:09:04.309077 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a5d1831-eae7-4ede-a37d-158ef6140d54" containerName="neutron-db-sync" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.309085 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a5d1831-eae7-4ede-a37d-158ef6140d54" containerName="neutron-db-sync" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.309240 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3446bf6c-8fd9-491e-aee2-87c44e34c315" containerName="dnsmasq-dns" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.309250 4749 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7a5d1831-eae7-4ede-a37d-158ef6140d54" containerName="neutron-db-sync" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.310049 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.384830 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-dns-swift-storage-0\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.386171 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-dns-svc\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.387523 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-ovsdbserver-sb\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.387711 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-config\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.387911 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-ovsdbserver-nb\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.388068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5nxq\" (UniqueName: \"kubernetes.io/projected/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-kube-api-access-j5nxq\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.389698 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5864dc4585-tnqz6"] Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.489055 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7d596895b8-zh48t"] Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.490565 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.492901 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-dns-swift-storage-0\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.492963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-dns-svc\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.493000 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-ovsdbserver-sb\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.493034 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-config\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.493059 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-ovsdbserver-nb\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 
16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.493122 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5nxq\" (UniqueName: \"kubernetes.io/projected/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-kube-api-access-j5nxq\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.494399 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-ovsdbserver-sb\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.494667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-config\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.494766 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-48cn6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.494806 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.494993 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.495002 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.495309 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-ovsdbserver-nb\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.495778 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-dns-svc\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.495900 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-dns-swift-storage-0\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.526085 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5nxq\" (UniqueName: \"kubernetes.io/projected/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-kube-api-access-j5nxq\") pod \"dnsmasq-dns-5864dc4585-tnqz6\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.531463 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d596895b8-zh48t"] Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.594482 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-httpd-config\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.594559 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7zbm\" (UniqueName: \"kubernetes.io/projected/0893ff76-efa3-496c-b499-0f6e3a4ffd59-kube-api-access-k7zbm\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.594594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-combined-ca-bundle\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.594640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-ovndb-tls-certs\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.594693 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-config\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.696205 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-httpd-config\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.696301 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-k7zbm\" (UniqueName: \"kubernetes.io/projected/0893ff76-efa3-496c-b499-0f6e3a4ffd59-kube-api-access-k7zbm\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.696331 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-combined-ca-bundle\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.696414 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-ovndb-tls-certs\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.696473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-config\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.716398 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-httpd-config\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.720464 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-combined-ca-bundle\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.720567 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-config\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.721163 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-ovndb-tls-certs\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.724512 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7zbm\" (UniqueName: \"kubernetes.io/projected/0893ff76-efa3-496c-b499-0f6e3a4ffd59-kube-api-access-k7zbm\") pod \"neutron-7d596895b8-zh48t\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.754067 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.831790 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:04 crc kubenswrapper[4749]: I0310 16:09:04.966052 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.001091 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-combined-ca-bundle\") pod \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.001242 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.001261 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-config-data\") pod \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.001300 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-scripts\") pod \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.001397 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-logs\") pod \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.001661 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6xmd\" (UniqueName: 
\"kubernetes.io/projected/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-kube-api-access-j6xmd\") pod \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.001697 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-httpd-run\") pod \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\" (UID: \"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1\") " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.002442 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" (UID: "89227fcb-0cbf-45c7-8b6c-9f98ec2210e1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.005633 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-logs" (OuterVolumeSpecName: "logs") pod "89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" (UID: "89227fcb-0cbf-45c7-8b6c-9f98ec2210e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.010351 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" (UID: "89227fcb-0cbf-45c7-8b6c-9f98ec2210e1"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.011535 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-kube-api-access-j6xmd" (OuterVolumeSpecName: "kube-api-access-j6xmd") pod "89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" (UID: "89227fcb-0cbf-45c7-8b6c-9f98ec2210e1"). InnerVolumeSpecName "kube-api-access-j6xmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.015509 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-scripts" (OuterVolumeSpecName: "scripts") pod "89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" (UID: "89227fcb-0cbf-45c7-8b6c-9f98ec2210e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.079827 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-config-data" (OuterVolumeSpecName: "config-data") pod "89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" (UID: "89227fcb-0cbf-45c7-8b6c-9f98ec2210e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.091100 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" (UID: "89227fcb-0cbf-45c7-8b6c-9f98ec2210e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.106154 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.106223 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6xmd\" (UniqueName: \"kubernetes.io/projected/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-kube-api-access-j6xmd\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.106241 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.106281 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.106355 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.106447 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.106465 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.123881 4749 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.167322 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e","Type":"ContainerStarted","Data":"14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2"} Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.167526 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" containerName="glance-log" containerID="cri-o://d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c" gracePeriod=30 Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.167624 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" containerName="glance-httpd" containerID="cri-o://14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2" gracePeriod=30 Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.170448 4749 generic.go:334] "Generic (PLEG): container finished" podID="89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" containerID="8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601" exitCode=0 Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.170481 4749 generic.go:334] "Generic (PLEG): container finished" podID="89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" containerID="d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861" exitCode=143 Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.170680 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.170774 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1","Type":"ContainerDied","Data":"8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601"} Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.170844 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1","Type":"ContainerDied","Data":"d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861"} Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.170859 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"89227fcb-0cbf-45c7-8b6c-9f98ec2210e1","Type":"ContainerDied","Data":"141ecface0fb479bc4c6e5bd5bd48d2e2f56ee1a4c857ce6d4ee89348b1f6364"} Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.170880 4749 scope.go:117] "RemoveContainer" containerID="8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.202467 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=20.202445791 podStartE2EDuration="20.202445791s" podCreationTimestamp="2026-03-10 16:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:05.191896021 +0000 UTC m=+1242.313761718" watchObservedRunningTime="2026-03-10 16:09:05.202445791 +0000 UTC m=+1242.324311478" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.208210 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath 
\"\"" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.246694 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.252484 4749 scope.go:117] "RemoveContainer" containerID="d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.268466 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.287078 4749 scope.go:117] "RemoveContainer" containerID="8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.294260 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:09:05 crc kubenswrapper[4749]: E0310 16:09:05.294966 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" containerName="glance-log" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.295005 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" containerName="glance-log" Mar 10 16:09:05 crc kubenswrapper[4749]: E0310 16:09:05.295037 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" containerName="glance-httpd" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.295047 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" containerName="glance-httpd" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.295337 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" containerName="glance-log" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.295419 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" containerName="glance-httpd" Mar 10 16:09:05 crc kubenswrapper[4749]: E0310 16:09:05.296321 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601\": container with ID starting with 8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601 not found: ID does not exist" containerID="8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.296388 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601"} err="failed to get container status \"8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601\": rpc error: code = NotFound desc = could not find container \"8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601\": container with ID starting with 8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601 not found: ID does not exist" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.296417 4749 scope.go:117] "RemoveContainer" containerID="d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.297537 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: E0310 16:09:05.303943 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861\": container with ID starting with d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861 not found: ID does not exist" containerID="d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.303991 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861"} err="failed to get container status \"d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861\": rpc error: code = NotFound desc = could not find container \"d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861\": container with ID starting with d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861 not found: ID does not exist" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.304021 4749 scope.go:117] "RemoveContainer" containerID="8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.304582 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.304755 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.304884 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601"} err="failed to get container status \"8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601\": 
rpc error: code = NotFound desc = could not find container \"8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601\": container with ID starting with 8c5ff717fc39f160d98dec73be183be15994be9d4556b12f8886a0250befa601 not found: ID does not exist" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.304926 4749 scope.go:117] "RemoveContainer" containerID="d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.307796 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861"} err="failed to get container status \"d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861\": rpc error: code = NotFound desc = could not find container \"d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861\": container with ID starting with d7b547c183072ce4b186319db929b3600e69e82a564afb6098322d2c1d3d2861 not found: ID does not exist" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.316425 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.363101 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5864dc4585-tnqz6"] Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.414796 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-config-data\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.415044 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/f83d36e6-e860-47a3-8590-d0a468a8819a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.415130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f83d36e6-e860-47a3-8590-d0a468a8819a-logs\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.415161 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-scripts\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.415182 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.415263 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.415311 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvrhn\" (UniqueName: 
\"kubernetes.io/projected/f83d36e6-e860-47a3-8590-d0a468a8819a-kube-api-access-bvrhn\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.415664 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.518029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.519601 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-config-data\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.519728 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f83d36e6-e860-47a3-8590-d0a468a8819a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.520233 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f83d36e6-e860-47a3-8590-d0a468a8819a-logs\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.520272 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-scripts\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.520316 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.520414 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.520481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvrhn\" (UniqueName: \"kubernetes.io/projected/f83d36e6-e860-47a3-8590-d0a468a8819a-kube-api-access-bvrhn\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.521890 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f83d36e6-e860-47a3-8590-d0a468a8819a-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.523222 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f83d36e6-e860-47a3-8590-d0a468a8819a-logs\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.523446 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.526928 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.527703 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-config-data\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.534064 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " 
pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.536666 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvrhn\" (UniqueName: \"kubernetes.io/projected/f83d36e6-e860-47a3-8590-d0a468a8819a-kube-api-access-bvrhn\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.548172 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-scripts\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.559028 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.634481 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.636752 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89227fcb-0cbf-45c7-8b6c-9f98ec2210e1" path="/var/lib/kubelet/pods/89227fcb-0cbf-45c7-8b6c-9f98ec2210e1/volumes" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.759696 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.934600 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-combined-ca-bundle\") pod \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.938625 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-httpd-run\") pod \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.939551 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" (UID: "3d9fa127-4985-41bf-a8e3-2702dc5f8d2e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.938915 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-config-data\") pod \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.940049 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.940803 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-logs" (OuterVolumeSpecName: "logs") pod "3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" (UID: "3d9fa127-4985-41bf-a8e3-2702dc5f8d2e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.940116 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-logs\") pod \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.941081 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-scripts\") pod \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.944354 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" (UID: "3d9fa127-4985-41bf-a8e3-2702dc5f8d2e"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.948033 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrv9k\" (UniqueName: \"kubernetes.io/projected/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-kube-api-access-zrv9k\") pod \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\" (UID: \"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e\") " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.957744 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-scripts" (OuterVolumeSpecName: "scripts") pod "3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" (UID: "3d9fa127-4985-41bf-a8e3-2702dc5f8d2e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.958514 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-kube-api-access-zrv9k" (OuterVolumeSpecName: "kube-api-access-zrv9k") pod "3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" (UID: "3d9fa127-4985-41bf-a8e3-2702dc5f8d2e"). InnerVolumeSpecName "kube-api-access-zrv9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.959583 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.959662 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.960269 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.960285 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.960299 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrv9k\" (UniqueName: \"kubernetes.io/projected/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-kube-api-access-zrv9k\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.973902 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" (UID: "3d9fa127-4985-41bf-a8e3-2702dc5f8d2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:05 crc kubenswrapper[4749]: I0310 16:09:05.988005 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.022631 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-config-data" (OuterVolumeSpecName: "config-data") pod "3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" (UID: "3d9fa127-4985-41bf-a8e3-2702dc5f8d2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.062582 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.062620 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.062634 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.181528 4749 generic.go:334] "Generic (PLEG): container finished" podID="0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" containerID="0f457c92d0e63c17ba27eb4cb2bcf081fc93033f435b7a6f34fadb128d1be857" exitCode=0 
Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.181623 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" event={"ID":"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf","Type":"ContainerDied","Data":"0f457c92d0e63c17ba27eb4cb2bcf081fc93033f435b7a6f34fadb128d1be857"} Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.181655 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" event={"ID":"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf","Type":"ContainerStarted","Data":"1cd7e4ec28667d6a965395522a1c8413b47d90fe7c88adb6ccc025da63908169"} Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.185279 4749 generic.go:334] "Generic (PLEG): container finished" podID="5d4d52fc-2652-4251-afea-b3d1e39ed0f3" containerID="7d1730370c1d50f08602e88fdd860a08bffb34493d3059343bab04e49ffcd929" exitCode=0 Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.185409 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tbtx4" event={"ID":"5d4d52fc-2652-4251-afea-b3d1e39ed0f3","Type":"ContainerDied","Data":"7d1730370c1d50f08602e88fdd860a08bffb34493d3059343bab04e49ffcd929"} Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.192551 4749 generic.go:334] "Generic (PLEG): container finished" podID="3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" containerID="14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2" exitCode=0 Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.192595 4749 generic.go:334] "Generic (PLEG): container finished" podID="3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" containerID="d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c" exitCode=143 Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.192615 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.192633 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e","Type":"ContainerDied","Data":"14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2"} Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.192695 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e","Type":"ContainerDied","Data":"d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c"} Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.192708 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d9fa127-4985-41bf-a8e3-2702dc5f8d2e","Type":"ContainerDied","Data":"baa62f6e5323111cc60431c469af76c9897b3d8213a45afbbc2b7a176bdd6a98"} Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.192729 4749 scope.go:117] "RemoveContainer" containerID="14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.260991 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.264851 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.279081 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:09:06 crc kubenswrapper[4749]: E0310 16:09:06.279602 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" containerName="glance-log" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.279628 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" containerName="glance-log" Mar 10 16:09:06 crc kubenswrapper[4749]: E0310 16:09:06.279657 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" containerName="glance-httpd" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.279668 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" containerName="glance-httpd" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.280359 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" containerName="glance-log" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.280411 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" containerName="glance-httpd" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.282816 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.288183 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.288821 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.295893 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.346960 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.368269 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.368401 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.368441 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.368464 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.368491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8pln\" (UniqueName: \"kubernetes.io/projected/87889224-54c2-4883-97e8-20e5ad3a8f8b-kube-api-access-d8pln\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.368726 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87889224-54c2-4883-97e8-20e5ad3a8f8b-logs\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.368854 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.368961 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87889224-54c2-4883-97e8-20e5ad3a8f8b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.390904 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d596895b8-zh48t"] Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.470874 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.470960 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc 
kubenswrapper[4749]: I0310 16:09:06.470997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.471021 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.471046 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8pln\" (UniqueName: \"kubernetes.io/projected/87889224-54c2-4883-97e8-20e5ad3a8f8b-kube-api-access-d8pln\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.471102 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87889224-54c2-4883-97e8-20e5ad3a8f8b-logs\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.471151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.471205 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87889224-54c2-4883-97e8-20e5ad3a8f8b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.471974 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87889224-54c2-4883-97e8-20e5ad3a8f8b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.473200 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87889224-54c2-4883-97e8-20e5ad3a8f8b-logs\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.473494 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.478108 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.478328 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.482935 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.492961 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8pln\" (UniqueName: \"kubernetes.io/projected/87889224-54c2-4883-97e8-20e5ad3a8f8b-kube-api-access-d8pln\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.494057 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.501953 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.600967 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.861874 4749 scope.go:117] "RemoveContainer" containerID="d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.991043 4749 scope.go:117] "RemoveContainer" containerID="14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2" Mar 10 16:09:06 crc kubenswrapper[4749]: E0310 16:09:06.992267 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2\": container with ID starting with 14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2 not found: ID does not exist" containerID="14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.992300 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2"} err="failed to get container status \"14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2\": rpc error: code = NotFound desc = could not find container \"14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2\": container with ID starting with 14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2 not found: ID does not exist" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.992335 4749 scope.go:117] "RemoveContainer" containerID="d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c" Mar 10 16:09:06 crc kubenswrapper[4749]: E0310 16:09:06.993547 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c\": container with ID starting with 
d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c not found: ID does not exist" containerID="d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.993573 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c"} err="failed to get container status \"d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c\": rpc error: code = NotFound desc = could not find container \"d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c\": container with ID starting with d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c not found: ID does not exist" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.993589 4749 scope.go:117] "RemoveContainer" containerID="14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.994320 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2"} err="failed to get container status \"14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2\": rpc error: code = NotFound desc = could not find container \"14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2\": container with ID starting with 14de144936f686a1002a53dcf00d2e04bc97a9f6cd9ae2a324b994ff34de43b2 not found: ID does not exist" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.994361 4749 scope.go:117] "RemoveContainer" containerID="d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c" Mar 10 16:09:06 crc kubenswrapper[4749]: I0310 16:09:06.996087 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c"} err="failed to get container status 
\"d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c\": rpc error: code = NotFound desc = could not find container \"d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c\": container with ID starting with d066759fbe510d3b3d4011e379c78fd0e3365acbb90846e807e22696615e0b7c not found: ID does not exist" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.212979 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d596895b8-zh48t" event={"ID":"0893ff76-efa3-496c-b499-0f6e3a4ffd59","Type":"ContainerStarted","Data":"c0a96ede134ecf3efecb01f0830126d4278b5249231e5ab86fe4c06e26a90648"} Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.220639 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f83d36e6-e860-47a3-8590-d0a468a8819a","Type":"ContainerStarted","Data":"cdce3f6ce3fde6f80be4b89d19db2a485e039cbe3dff5057d1201aad3884b880"} Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.254958 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d4bc56d77-cjnl8"] Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.256877 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.259161 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.263840 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.297509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-ovndb-tls-certs\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.297568 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4b6m\" (UniqueName: \"kubernetes.io/projected/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-kube-api-access-b4b6m\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.297600 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-internal-tls-certs\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.297670 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-config\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 
10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.297706 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-combined-ca-bundle\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.297723 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-public-tls-certs\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.297742 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-httpd-config\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.302426 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d4bc56d77-cjnl8"] Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.400064 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-ovndb-tls-certs\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.400151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4b6m\" (UniqueName: \"kubernetes.io/projected/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-kube-api-access-b4b6m\") 
pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.400195 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-internal-tls-certs\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.400303 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-config\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.400360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-combined-ca-bundle\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.400403 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-public-tls-certs\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.400431 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-httpd-config\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " 
pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.410413 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-ovndb-tls-certs\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.411932 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-combined-ca-bundle\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.412329 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-config\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.412961 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-public-tls-certs\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.414124 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-httpd-config\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.415596 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-internal-tls-certs\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.424290 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4b6m\" (UniqueName: \"kubernetes.io/projected/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-kube-api-access-b4b6m\") pod \"neutron-d4bc56d77-cjnl8\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.434586 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.577396 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.622350 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d9fa127-4985-41bf-a8e3-2702dc5f8d2e" path="/var/lib/kubelet/pods/3d9fa127-4985-41bf-a8e3-2702dc5f8d2e/volumes" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.708571 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-tbtx4" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.806351 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-logs\") pod \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.806464 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-combined-ca-bundle\") pod \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.806496 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-config-data\") pod \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.806581 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-scripts\") pod \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.806613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27gk2\" (UniqueName: \"kubernetes.io/projected/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-kube-api-access-27gk2\") pod \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\" (UID: \"5d4d52fc-2652-4251-afea-b3d1e39ed0f3\") " Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.806843 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-logs" (OuterVolumeSpecName: "logs") pod "5d4d52fc-2652-4251-afea-b3d1e39ed0f3" (UID: "5d4d52fc-2652-4251-afea-b3d1e39ed0f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.807141 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.815210 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-scripts" (OuterVolumeSpecName: "scripts") pod "5d4d52fc-2652-4251-afea-b3d1e39ed0f3" (UID: "5d4d52fc-2652-4251-afea-b3d1e39ed0f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.816390 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-kube-api-access-27gk2" (OuterVolumeSpecName: "kube-api-access-27gk2") pod "5d4d52fc-2652-4251-afea-b3d1e39ed0f3" (UID: "5d4d52fc-2652-4251-afea-b3d1e39ed0f3"). InnerVolumeSpecName "kube-api-access-27gk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.835906 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-config-data" (OuterVolumeSpecName: "config-data") pod "5d4d52fc-2652-4251-afea-b3d1e39ed0f3" (UID: "5d4d52fc-2652-4251-afea-b3d1e39ed0f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.861542 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d4d52fc-2652-4251-afea-b3d1e39ed0f3" (UID: "5d4d52fc-2652-4251-afea-b3d1e39ed0f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.909345 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.909400 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.909409 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:07 crc kubenswrapper[4749]: I0310 16:09:07.909419 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27gk2\" (UniqueName: \"kubernetes.io/projected/5d4d52fc-2652-4251-afea-b3d1e39ed0f3-kube-api-access-27gk2\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.224248 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d4bc56d77-cjnl8"] Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.232238 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tbtx4" 
event={"ID":"5d4d52fc-2652-4251-afea-b3d1e39ed0f3","Type":"ContainerDied","Data":"badb3cbf5eba73ad3dc9700c085e285054828a3ced8b6a032338ddd38ad6d4c3"} Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.232258 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-tbtx4" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.232284 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="badb3cbf5eba73ad3dc9700c085e285054828a3ced8b6a032338ddd38ad6d4c3" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.233681 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"87889224-54c2-4883-97e8-20e5ad3a8f8b","Type":"ContainerStarted","Data":"bda6a20356cb4a8d7076ab876c23ba60c794e84bf8e69809f88a06da62151e03"} Mar 10 16:09:08 crc kubenswrapper[4749]: W0310 16:09:08.235626 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cf0df71_68ed_41f5_8bee_8f2ec91f133f.slice/crio-3df2cecd3588f3c17b90c9f6602510401b4bb02cada25c275a9d4d0b0c1f9595 WatchSource:0}: Error finding container 3df2cecd3588f3c17b90c9f6602510401b4bb02cada25c275a9d4d0b0c1f9595: Status 404 returned error can't find the container with id 3df2cecd3588f3c17b90c9f6602510401b4bb02cada25c275a9d4d0b0c1f9595 Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.344815 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6d5d974b56-qhhk8"] Mar 10 16:09:08 crc kubenswrapper[4749]: E0310 16:09:08.345189 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d4d52fc-2652-4251-afea-b3d1e39ed0f3" containerName="placement-db-sync" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.345206 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4d52fc-2652-4251-afea-b3d1e39ed0f3" containerName="placement-db-sync" Mar 10 16:09:08 crc 
kubenswrapper[4749]: I0310 16:09:08.345363 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d4d52fc-2652-4251-afea-b3d1e39ed0f3" containerName="placement-db-sync" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.346285 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.348745 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.349432 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.349633 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.349877 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-s4t7x" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.354664 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.370050 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d5d974b56-qhhk8"] Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.417555 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-internal-tls-certs\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.417939 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrns9\" (UniqueName: 
\"kubernetes.io/projected/cc930999-5118-4423-a996-c11f390919f2-kube-api-access-mrns9\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.417986 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-combined-ca-bundle\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.418009 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-scripts\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.418027 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-public-tls-certs\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.418063 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc930999-5118-4423-a996-c11f390919f2-logs\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.418244 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-config-data\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.520660 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-internal-tls-certs\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.520728 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrns9\" (UniqueName: \"kubernetes.io/projected/cc930999-5118-4423-a996-c11f390919f2-kube-api-access-mrns9\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.520788 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-combined-ca-bundle\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.520822 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-scripts\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.520847 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-public-tls-certs\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.520885 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc930999-5118-4423-a996-c11f390919f2-logs\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.520908 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-config-data\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.522892 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc930999-5118-4423-a996-c11f390919f2-logs\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.525233 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-scripts\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.525721 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-internal-tls-certs\") pod \"placement-6d5d974b56-qhhk8\" (UID: 
\"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.528505 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-config-data\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.530920 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-combined-ca-bundle\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.532733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-public-tls-certs\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.541751 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrns9\" (UniqueName: \"kubernetes.io/projected/cc930999-5118-4423-a996-c11f390919f2-kube-api-access-mrns9\") pod \"placement-6d5d974b56-qhhk8\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:08 crc kubenswrapper[4749]: I0310 16:09:08.667148 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:09 crc kubenswrapper[4749]: I0310 16:09:09.170748 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d5d974b56-qhhk8"] Mar 10 16:09:09 crc kubenswrapper[4749]: W0310 16:09:09.172167 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc930999_5118_4423_a996_c11f390919f2.slice/crio-3e009ff08a655289306604f52acf85efa7bdb3fd98be73ec44893e0e1214a45f WatchSource:0}: Error finding container 3e009ff08a655289306604f52acf85efa7bdb3fd98be73ec44893e0e1214a45f: Status 404 returned error can't find the container with id 3e009ff08a655289306604f52acf85efa7bdb3fd98be73ec44893e0e1214a45f Mar 10 16:09:09 crc kubenswrapper[4749]: I0310 16:09:09.249423 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d4bc56d77-cjnl8" event={"ID":"0cf0df71-68ed-41f5-8bee-8f2ec91f133f","Type":"ContainerStarted","Data":"3df2cecd3588f3c17b90c9f6602510401b4bb02cada25c275a9d4d0b0c1f9595"} Mar 10 16:09:09 crc kubenswrapper[4749]: I0310 16:09:09.251910 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d5d974b56-qhhk8" event={"ID":"cc930999-5118-4423-a996-c11f390919f2","Type":"ContainerStarted","Data":"3e009ff08a655289306604f52acf85efa7bdb3fd98be73ec44893e0e1214a45f"} Mar 10 16:09:09 crc kubenswrapper[4749]: I0310 16:09:09.254802 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d596895b8-zh48t" event={"ID":"0893ff76-efa3-496c-b499-0f6e3a4ffd59","Type":"ContainerStarted","Data":"a84826754d7077e0e9d7942da7d6c3211342df1be5029fe9d914ac549a6f2958"} Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.300867 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d5d974b56-qhhk8" 
event={"ID":"cc930999-5118-4423-a996-c11f390919f2","Type":"ContainerStarted","Data":"8a3c6f2258745be6f39810ceb82863bebc0e919f31b8df07a4f0c7716b28f461"} Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.301588 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.301604 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d5d974b56-qhhk8" event={"ID":"cc930999-5118-4423-a996-c11f390919f2","Type":"ContainerStarted","Data":"6821906d92b9f98d07a757898096aceb546ffc4e4e1a632f5c9911d18b877b56"} Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.301621 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.306568 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"87889224-54c2-4883-97e8-20e5ad3a8f8b","Type":"ContainerStarted","Data":"adf57b3b88b9c15bb9ab3a7da95e0ca84929d4e9d7da5bc5ffd73b9666f00f7e"} Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.315417 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fea9311-0e47-4352-8d4f-ac90db816fc1","Type":"ContainerStarted","Data":"aa94feb75acb96142e193e0eeaf17ee8d9f0bbf8d774f64014bb8cc27c9204cd"} Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.329972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" event={"ID":"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf","Type":"ContainerStarted","Data":"844567794d4661f2fc86e99dd154bbb8b52e8301aacdda4f107bc80fcdb4d8b0"} Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.330202 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.332656 4749 
generic.go:334] "Generic (PLEG): container finished" podID="5dbeb14b-95ab-438e-b3e5-be66a6c34188" containerID="71646d3d80a269ec395cc73be3f051a86dc38c361242cd8c53640573173be447" exitCode=0 Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.332733 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vxg2v" event={"ID":"5dbeb14b-95ab-438e-b3e5-be66a6c34188","Type":"ContainerDied","Data":"71646d3d80a269ec395cc73be3f051a86dc38c361242cd8c53640573173be447"} Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.334982 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d596895b8-zh48t" event={"ID":"0893ff76-efa3-496c-b499-0f6e3a4ffd59","Type":"ContainerStarted","Data":"47d39ee2038a7c20b81e06ce8eb1f8cc68d20deef858b1d0a015b6fa325c1429"} Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.335984 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.342971 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d4bc56d77-cjnl8" event={"ID":"0cf0df71-68ed-41f5-8bee-8f2ec91f133f","Type":"ContainerStarted","Data":"52861deaa6b8da0398f6ad619b2142325877fc700973aeb0570d3800fe2acd5b"} Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.343007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d4bc56d77-cjnl8" event={"ID":"0cf0df71-68ed-41f5-8bee-8f2ec91f133f","Type":"ContainerStarted","Data":"6384072e7eda3838a0f4a2261ffc6756be1dc38df934506afb80c689b299737c"} Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.343548 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.347104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"f83d36e6-e860-47a3-8590-d0a468a8819a","Type":"ContainerStarted","Data":"8fbe1339ff08841cc1604985c8269450f32793172da5b912329dedd533f298cd"} Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.352798 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" podStartSLOduration=6.352781689 podStartE2EDuration="6.352781689s" podCreationTimestamp="2026-03-10 16:09:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:10.347144154 +0000 UTC m=+1247.469009841" watchObservedRunningTime="2026-03-10 16:09:10.352781689 +0000 UTC m=+1247.474647376" Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.355017 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6d5d974b56-qhhk8" podStartSLOduration=2.355007131 podStartE2EDuration="2.355007131s" podCreationTimestamp="2026-03-10 16:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:10.328603733 +0000 UTC m=+1247.450469420" watchObservedRunningTime="2026-03-10 16:09:10.355007131 +0000 UTC m=+1247.476872828" Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.370622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gwdl7" event={"ID":"8f6aaf20-62e0-47eb-b54d-6edbdf95e770","Type":"ContainerStarted","Data":"0ef5e6d29e7ae24fea6a3d6d6e029a7cbf120f70ff6cc023e2ac5c6e35b2fbcd"} Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.373071 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.373052168 podStartE2EDuration="5.373052168s" podCreationTimestamp="2026-03-10 16:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-10 16:09:10.366739304 +0000 UTC m=+1247.488605001" watchObservedRunningTime="2026-03-10 16:09:10.373052168 +0000 UTC m=+1247.494917855" Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.396690 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7d596895b8-zh48t" podStartSLOduration=6.396670399 podStartE2EDuration="6.396670399s" podCreationTimestamp="2026-03-10 16:09:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:10.388808492 +0000 UTC m=+1247.510674189" watchObservedRunningTime="2026-03-10 16:09:10.396670399 +0000 UTC m=+1247.518536086" Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.445092 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d4bc56d77-cjnl8" podStartSLOduration=3.445076163 podStartE2EDuration="3.445076163s" podCreationTimestamp="2026-03-10 16:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:10.442282676 +0000 UTC m=+1247.564148353" watchObservedRunningTime="2026-03-10 16:09:10.445076163 +0000 UTC m=+1247.566941850" Mar 10 16:09:10 crc kubenswrapper[4749]: I0310 16:09:10.468479 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-gwdl7" podStartSLOduration=2.516775949 podStartE2EDuration="33.468459317s" podCreationTimestamp="2026-03-10 16:08:37 +0000 UTC" firstStartedPulling="2026-03-10 16:08:38.756263128 +0000 UTC m=+1215.878128815" lastFinishedPulling="2026-03-10 16:09:09.707946496 +0000 UTC m=+1246.829812183" observedRunningTime="2026-03-10 16:09:10.464296193 +0000 UTC m=+1247.586161900" watchObservedRunningTime="2026-03-10 16:09:10.468459317 +0000 UTC m=+1247.590325004" Mar 10 16:09:11 crc kubenswrapper[4749]: I0310 16:09:11.393512 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"87889224-54c2-4883-97e8-20e5ad3a8f8b","Type":"ContainerStarted","Data":"7685d7f3d9ad18b7fcad8e920aa20e44c5cb9ccf8d53e0fd9f48a7461eb5386f"} Mar 10 16:09:11 crc kubenswrapper[4749]: I0310 16:09:11.397941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f83d36e6-e860-47a3-8590-d0a468a8819a","Type":"ContainerStarted","Data":"c556b2bfacae6027405b4d2e2aa2707c6cb88193c5f1a7d8fa814be6d3f556d5"} Mar 10 16:09:11 crc kubenswrapper[4749]: I0310 16:09:11.428539 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.428514249 podStartE2EDuration="5.428514249s" podCreationTimestamp="2026-03-10 16:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:11.422478533 +0000 UTC m=+1248.544344240" watchObservedRunningTime="2026-03-10 16:09:11.428514249 +0000 UTC m=+1248.550379936" Mar 10 16:09:11 crc kubenswrapper[4749]: I0310 16:09:11.831641 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:09:11 crc kubenswrapper[4749]: I0310 16:09:11.987916 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-credential-keys\") pod \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " Mar 10 16:09:11 crc kubenswrapper[4749]: I0310 16:09:11.988316 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-combined-ca-bundle\") pod \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " Mar 10 16:09:11 crc kubenswrapper[4749]: I0310 16:09:11.988337 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bkzc\" (UniqueName: \"kubernetes.io/projected/5dbeb14b-95ab-438e-b3e5-be66a6c34188-kube-api-access-7bkzc\") pod \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " Mar 10 16:09:11 crc kubenswrapper[4749]: I0310 16:09:11.988407 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-config-data\") pod \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " Mar 10 16:09:11 crc kubenswrapper[4749]: I0310 16:09:11.988535 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-fernet-keys\") pod \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " Mar 10 16:09:11 crc kubenswrapper[4749]: I0310 16:09:11.988566 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-scripts\") pod \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\" (UID: \"5dbeb14b-95ab-438e-b3e5-be66a6c34188\") " Mar 10 16:09:11 crc kubenswrapper[4749]: I0310 16:09:11.994311 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-scripts" (OuterVolumeSpecName: "scripts") pod "5dbeb14b-95ab-438e-b3e5-be66a6c34188" (UID: "5dbeb14b-95ab-438e-b3e5-be66a6c34188"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:11 crc kubenswrapper[4749]: I0310 16:09:11.996732 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dbeb14b-95ab-438e-b3e5-be66a6c34188-kube-api-access-7bkzc" (OuterVolumeSpecName: "kube-api-access-7bkzc") pod "5dbeb14b-95ab-438e-b3e5-be66a6c34188" (UID: "5dbeb14b-95ab-438e-b3e5-be66a6c34188"). InnerVolumeSpecName "kube-api-access-7bkzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:11 crc kubenswrapper[4749]: I0310 16:09:11.996812 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5dbeb14b-95ab-438e-b3e5-be66a6c34188" (UID: "5dbeb14b-95ab-438e-b3e5-be66a6c34188"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.001516 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5dbeb14b-95ab-438e-b3e5-be66a6c34188" (UID: "5dbeb14b-95ab-438e-b3e5-be66a6c34188"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.019645 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dbeb14b-95ab-438e-b3e5-be66a6c34188" (UID: "5dbeb14b-95ab-438e-b3e5-be66a6c34188"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.025183 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-config-data" (OuterVolumeSpecName: "config-data") pod "5dbeb14b-95ab-438e-b3e5-be66a6c34188" (UID: "5dbeb14b-95ab-438e-b3e5-be66a6c34188"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.091246 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.091278 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.091288 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bkzc\" (UniqueName: \"kubernetes.io/projected/5dbeb14b-95ab-438e-b3e5-be66a6c34188-kube-api-access-7bkzc\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.091299 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-config-data\") on node \"crc\" 
DevicePath \"\"" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.091307 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.091346 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dbeb14b-95ab-438e-b3e5-be66a6c34188-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.406875 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vxg2v" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.409408 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vxg2v" event={"ID":"5dbeb14b-95ab-438e-b3e5-be66a6c34188","Type":"ContainerDied","Data":"d6367073ae8bc849e0cfdc1245440a8158efc19856c85299de3538dff1ae44c1"} Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.409445 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6367073ae8bc849e0cfdc1245440a8158efc19856c85299de3538dff1ae44c1" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.621063 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d6cd8c57d-9v7dx"] Mar 10 16:09:12 crc kubenswrapper[4749]: E0310 16:09:12.621575 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbeb14b-95ab-438e-b3e5-be66a6c34188" containerName="keystone-bootstrap" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.621591 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dbeb14b-95ab-438e-b3e5-be66a6c34188" containerName="keystone-bootstrap" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.621785 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dbeb14b-95ab-438e-b3e5-be66a6c34188" 
containerName="keystone-bootstrap" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.623590 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.629099 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.629177 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.629567 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.629698 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.629816 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ppc5j" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.629922 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.633907 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d6cd8c57d-9v7dx"] Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.809878 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-public-tls-certs\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.809952 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-credential-keys\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.810079 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-internal-tls-certs\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.810111 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-combined-ca-bundle\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.810187 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc74m\" (UniqueName: \"kubernetes.io/projected/a7637a97-25f4-4696-a41c-545d0d6b0e9a-kube-api-access-dc74m\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.810353 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-fernet-keys\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.810482 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-scripts\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.810521 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-config-data\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.912337 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc74m\" (UniqueName: \"kubernetes.io/projected/a7637a97-25f4-4696-a41c-545d0d6b0e9a-kube-api-access-dc74m\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.912460 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-fernet-keys\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.912494 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-scripts\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.912510 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-config-data\") pod 
\"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.913269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-public-tls-certs\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.913351 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-credential-keys\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.913443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-internal-tls-certs\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.913468 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-combined-ca-bundle\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.917728 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-credential-keys\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " 
pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.918483 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-scripts\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.918943 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-internal-tls-certs\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.919818 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-combined-ca-bundle\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.921691 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-fernet-keys\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.922368 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-public-tls-certs\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.923086 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-config-data\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.936198 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc74m\" (UniqueName: \"kubernetes.io/projected/a7637a97-25f4-4696-a41c-545d0d6b0e9a-kube-api-access-dc74m\") pod \"keystone-d6cd8c57d-9v7dx\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:12 crc kubenswrapper[4749]: I0310 16:09:12.946861 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:13 crc kubenswrapper[4749]: I0310 16:09:13.420661 4749 generic.go:334] "Generic (PLEG): container finished" podID="8f6aaf20-62e0-47eb-b54d-6edbdf95e770" containerID="0ef5e6d29e7ae24fea6a3d6d6e029a7cbf120f70ff6cc023e2ac5c6e35b2fbcd" exitCode=0 Mar 10 16:09:13 crc kubenswrapper[4749]: I0310 16:09:13.420762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gwdl7" event={"ID":"8f6aaf20-62e0-47eb-b54d-6edbdf95e770","Type":"ContainerDied","Data":"0ef5e6d29e7ae24fea6a3d6d6e029a7cbf120f70ff6cc023e2ac5c6e35b2fbcd"} Mar 10 16:09:13 crc kubenswrapper[4749]: I0310 16:09:13.458833 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d6cd8c57d-9v7dx"] Mar 10 16:09:14 crc kubenswrapper[4749]: I0310 16:09:14.040256 4749 scope.go:117] "RemoveContainer" containerID="5eb2f36eb800ec1839f8b6d93dbba154cda17aa966028a77c2bbf78455bc15bf" Mar 10 16:09:14 crc kubenswrapper[4749]: I0310 16:09:14.432267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d6cd8c57d-9v7dx" 
event={"ID":"a7637a97-25f4-4696-a41c-545d0d6b0e9a","Type":"ContainerStarted","Data":"261594765d431b29d11923174d8f5b406353566732ec1cf061a23975229933f0"} Mar 10 16:09:14 crc kubenswrapper[4749]: I0310 16:09:14.432323 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d6cd8c57d-9v7dx" event={"ID":"a7637a97-25f4-4696-a41c-545d0d6b0e9a","Type":"ContainerStarted","Data":"389c8bdcf32aecbae0e9690425b24528a8e6771107e0534249c66451d5d9a23f"} Mar 10 16:09:14 crc kubenswrapper[4749]: I0310 16:09:14.432360 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:14 crc kubenswrapper[4749]: I0310 16:09:14.451740 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-d6cd8c57d-9v7dx" podStartSLOduration=2.451710467 podStartE2EDuration="2.451710467s" podCreationTimestamp="2026-03-10 16:09:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:14.449719552 +0000 UTC m=+1251.571585249" watchObservedRunningTime="2026-03-10 16:09:14.451710467 +0000 UTC m=+1251.573576174" Mar 10 16:09:14 crc kubenswrapper[4749]: I0310 16:09:14.755666 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:14 crc kubenswrapper[4749]: I0310 16:09:14.808626 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6478444fbc-gx2n2"] Mar 10 16:09:14 crc kubenswrapper[4749]: I0310 16:09:14.808846 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" podUID="3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" containerName="dnsmasq-dns" containerID="cri-o://c950f119d2732837fc7212b09951b970610cc472675fbebbfa2aa0c72cf010ef" gracePeriod=10 Mar 10 16:09:15 crc kubenswrapper[4749]: I0310 16:09:15.449683 4749 generic.go:334] "Generic 
(PLEG): container finished" podID="3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" containerID="c950f119d2732837fc7212b09951b970610cc472675fbebbfa2aa0c72cf010ef" exitCode=0 Mar 10 16:09:15 crc kubenswrapper[4749]: I0310 16:09:15.449934 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" event={"ID":"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0","Type":"ContainerDied","Data":"c950f119d2732837fc7212b09951b970610cc472675fbebbfa2aa0c72cf010ef"} Mar 10 16:09:15 crc kubenswrapper[4749]: I0310 16:09:15.635860 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 16:09:15 crc kubenswrapper[4749]: I0310 16:09:15.635928 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 16:09:15 crc kubenswrapper[4749]: I0310 16:09:15.691595 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 16:09:15 crc kubenswrapper[4749]: I0310 16:09:15.807977 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.477818 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gwdl7" event={"ID":"8f6aaf20-62e0-47eb-b54d-6edbdf95e770","Type":"ContainerDied","Data":"31953bdc70f25cf2f794f2c49f79104ada253ebaa98d6e165f75d83ee398f90d"} Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.477901 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31953bdc70f25cf2f794f2c49f79104ada253ebaa98d6e165f75d83ee398f90d" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.477943 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.478192 4749 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.531441 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gwdl7" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.602411 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.604005 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.657729 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.684820 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqh5l\" (UniqueName: \"kubernetes.io/projected/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-kube-api-access-mqh5l\") pod \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\" (UID: \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\") " Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.685018 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-db-sync-config-data\") pod \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\" (UID: \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\") " Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.685125 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-combined-ca-bundle\") pod \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\" (UID: \"8f6aaf20-62e0-47eb-b54d-6edbdf95e770\") " Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.694712 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8f6aaf20-62e0-47eb-b54d-6edbdf95e770" (UID: "8f6aaf20-62e0-47eb-b54d-6edbdf95e770"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.698273 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-kube-api-access-mqh5l" (OuterVolumeSpecName: "kube-api-access-mqh5l") pod "8f6aaf20-62e0-47eb-b54d-6edbdf95e770" (UID: "8f6aaf20-62e0-47eb-b54d-6edbdf95e770"). InnerVolumeSpecName "kube-api-access-mqh5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.700563 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.728528 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f6aaf20-62e0-47eb-b54d-6edbdf95e770" (UID: "8f6aaf20-62e0-47eb-b54d-6edbdf95e770"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.766600 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.791290 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.791311 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.791322 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqh5l\" (UniqueName: \"kubernetes.io/projected/8f6aaf20-62e0-47eb-b54d-6edbdf95e770-kube-api-access-mqh5l\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.892837 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-ovsdbserver-nb\") pod \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.893145 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-dns-svc\") pod \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.893412 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-ovsdbserver-sb\") pod \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 
16:09:16.893887 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhgx4\" (UniqueName: \"kubernetes.io/projected/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-kube-api-access-qhgx4\") pod \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.894234 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-config\") pod \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.894368 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-dns-swift-storage-0\") pod \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\" (UID: \"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0\") " Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.897900 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-kube-api-access-qhgx4" (OuterVolumeSpecName: "kube-api-access-qhgx4") pod "3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" (UID: "3b008b4c-16e9-4cff-a2e5-06e0a6936cd0"). InnerVolumeSpecName "kube-api-access-qhgx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.937242 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" (UID: "3b008b4c-16e9-4cff-a2e5-06e0a6936cd0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.941565 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" (UID: "3b008b4c-16e9-4cff-a2e5-06e0a6936cd0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.946591 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-config" (OuterVolumeSpecName: "config") pod "3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" (UID: "3b008b4c-16e9-4cff-a2e5-06e0a6936cd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.946997 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" (UID: "3b008b4c-16e9-4cff-a2e5-06e0a6936cd0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.947606 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" (UID: "3b008b4c-16e9-4cff-a2e5-06e0a6936cd0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.996696 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.996727 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhgx4\" (UniqueName: \"kubernetes.io/projected/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-kube-api-access-qhgx4\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.996737 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.996746 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.996755 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:16 crc kubenswrapper[4749]: I0310 16:09:16.996763 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.490769 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fea9311-0e47-4352-8d4f-ac90db816fc1","Type":"ContainerStarted","Data":"69726bc538fb4d8696a56b555259e84c8f0c2ad08bcb562273e4b98326de76d5"} Mar 10 16:09:17 crc kubenswrapper[4749]: 
I0310 16:09:17.495701 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.496447 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gwdl7" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.498235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6478444fbc-gx2n2" event={"ID":"3b008b4c-16e9-4cff-a2e5-06e0a6936cd0","Type":"ContainerDied","Data":"587e67ad90a66b1d493a5ca3cd8e8eddc18b50697548124b66650655a2b94482"} Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.498306 4749 scope.go:117] "RemoveContainer" containerID="c950f119d2732837fc7212b09951b970610cc472675fbebbfa2aa0c72cf010ef" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.498620 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.498911 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.539247 4749 scope.go:117] "RemoveContainer" containerID="1da76019397177047d8a7f49c25186d89be470dc084537b7618a4374933a58fe" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.596469 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6478444fbc-gx2n2"] Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.627270 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6478444fbc-gx2n2"] Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.754066 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-94bd49868-nj59v"] Mar 10 16:09:17 crc kubenswrapper[4749]: E0310 16:09:17.754437 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8f6aaf20-62e0-47eb-b54d-6edbdf95e770" containerName="barbican-db-sync" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.754453 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6aaf20-62e0-47eb-b54d-6edbdf95e770" containerName="barbican-db-sync" Mar 10 16:09:17 crc kubenswrapper[4749]: E0310 16:09:17.754495 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" containerName="dnsmasq-dns" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.754501 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" containerName="dnsmasq-dns" Mar 10 16:09:17 crc kubenswrapper[4749]: E0310 16:09:17.754513 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" containerName="init" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.754520 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" containerName="init" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.754694 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" containerName="dnsmasq-dns" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.754739 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f6aaf20-62e0-47eb-b54d-6edbdf95e770" containerName="barbican-db-sync" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.755729 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-94bd49868-nj59v" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.758645 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.758873 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.768826 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zjqk6" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.772598 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-67dc88fb49-f9s7n"] Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.782518 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-67dc88fb49-f9s7n" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.787047 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.806625 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-94bd49868-nj59v"] Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.850009 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x87zp\" (UniqueName: \"kubernetes.io/projected/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-kube-api-access-x87zp\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.850052 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-combined-ca-bundle\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.850080 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-logs\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.850103 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-config-data-custom\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.850130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-config-data-custom\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.850178 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqg75\" (UniqueName: \"kubernetes.io/projected/7e75ef50-1c0b-498e-8448-39a7c8912f96-kube-api-access-xqg75\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v" Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 
16:09:17.850207 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-config-data\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.850229 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-config-data\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.850261 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e75ef50-1c0b-498e-8448-39a7c8912f96-logs\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.850289 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-combined-ca-bundle\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.860220 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-67dc88fb49-f9s7n"]
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.918545 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"]
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.920321 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.936559 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"]
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.952323 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqg75\" (UniqueName: \"kubernetes.io/projected/7e75ef50-1c0b-498e-8448-39a7c8912f96-kube-api-access-xqg75\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.952405 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-config-data\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.952439 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-config-data\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.952472 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e75ef50-1c0b-498e-8448-39a7c8912f96-logs\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.952510 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-combined-ca-bundle\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.952543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x87zp\" (UniqueName: \"kubernetes.io/projected/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-kube-api-access-x87zp\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.952561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-combined-ca-bundle\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.952582 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-logs\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.952626 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-config-data-custom\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.952650 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-config-data-custom\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.953428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e75ef50-1c0b-498e-8448-39a7c8912f96-logs\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.962026 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-logs\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.966036 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-combined-ca-bundle\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.967113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-combined-ca-bundle\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.972791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-config-data\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.979330 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-config-data-custom\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.979984 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-config-data-custom\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.986474 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-config-data\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v"
Mar 10 16:09:17 crc kubenswrapper[4749]: I0310 16:09:17.997960 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x87zp\" (UniqueName: \"kubernetes.io/projected/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-kube-api-access-x87zp\") pod \"barbican-worker-67dc88fb49-f9s7n\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " pod="openstack/barbican-worker-67dc88fb49-f9s7n"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.056608 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqg75\" (UniqueName: \"kubernetes.io/projected/7e75ef50-1c0b-498e-8448-39a7c8912f96-kube-api-access-xqg75\") pod \"barbican-keystone-listener-94bd49868-nj59v\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " pod="openstack/barbican-keystone-listener-94bd49868-nj59v"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.062033 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-79cc84f6-m4lp5"]
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.063761 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.064775 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-ovsdbserver-sb\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.064852 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-dns-svc\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.064891 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-dns-swift-storage-0\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.064997 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hflch\" (UniqueName: \"kubernetes.io/projected/db16e75b-2bca-4ba2-a169-146ceb4ab23d-kube-api-access-hflch\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.065050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.065112 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-config\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.070260 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.078681 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79cc84f6-m4lp5"]
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.112953 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-94bd49868-nj59v"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.136457 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-67dc88fb49-f9s7n"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.166980 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-config-data-custom\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.167059 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-config\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.167102 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-combined-ca-bundle\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.167136 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-ovsdbserver-sb\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.167167 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-dns-svc\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.167936 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-dns-svc\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.168124 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-config\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.170207 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-ovsdbserver-sb\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.170420 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-config-data\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.170536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-dns-swift-storage-0\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.170737 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e775d653-0d4c-4cb9-bed2-962b6589c5ee-logs\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.170806 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp9pt\" (UniqueName: \"kubernetes.io/projected/e775d653-0d4c-4cb9-bed2-962b6589c5ee-kube-api-access-sp9pt\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.170908 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hflch\" (UniqueName: \"kubernetes.io/projected/db16e75b-2bca-4ba2-a169-146ceb4ab23d-kube-api-access-hflch\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.170998 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.172013 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.172583 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-dns-swift-storage-0\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.191390 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hflch\" (UniqueName: \"kubernetes.io/projected/db16e75b-2bca-4ba2-a169-146ceb4ab23d-kube-api-access-hflch\") pod \"dnsmasq-dns-5b78c5c5d5-dx4gn\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.253584 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.272758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-combined-ca-bundle\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.272853 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-config-data\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.272928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e775d653-0d4c-4cb9-bed2-962b6589c5ee-logs\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.272958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp9pt\" (UniqueName: \"kubernetes.io/projected/e775d653-0d4c-4cb9-bed2-962b6589c5ee-kube-api-access-sp9pt\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.273018 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-config-data-custom\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.274835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e775d653-0d4c-4cb9-bed2-962b6589c5ee-logs\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.283642 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-combined-ca-bundle\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.284481 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-config-data-custom\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.286666 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-config-data\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.324203 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp9pt\" (UniqueName: \"kubernetes.io/projected/e775d653-0d4c-4cb9-bed2-962b6589c5ee-kube-api-access-sp9pt\") pod \"barbican-api-79cc84f6-m4lp5\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.418112 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.525554 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.526020 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.624568 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-67dc88fb49-f9s7n"]
Mar 10 16:09:18 crc kubenswrapper[4749]: I0310 16:09:18.797746 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-94bd49868-nj59v"]
Mar 10 16:09:18 crc kubenswrapper[4749]: W0310 16:09:18.804091 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e75ef50_1c0b_498e_8448_39a7c8912f96.slice/crio-f6cbc30e46a0924e0dc742ff7bb9f702d15b8ccd9fe4cf6cff5c479718b4a71f WatchSource:0}: Error finding container f6cbc30e46a0924e0dc742ff7bb9f702d15b8ccd9fe4cf6cff5c479718b4a71f: Status 404 returned error can't find the container with id f6cbc30e46a0924e0dc742ff7bb9f702d15b8ccd9fe4cf6cff5c479718b4a71f
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.015991 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"]
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.136867 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79cc84f6-m4lp5"]
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.433565 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.435083 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.560079 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ff4lh" event={"ID":"876272e9-3af8-40ba-aac7-40f8cecc909e","Type":"ContainerStarted","Data":"1dc1d42c78b2ad1937fd19367d842bf5ad46d49552fa234892aba8a8ed5f0cf3"}
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.571121 4749 generic.go:334] "Generic (PLEG): container finished" podID="db16e75b-2bca-4ba2-a169-146ceb4ab23d" containerID="d5c31d8204fbcc96edeb84f0f339f93b7bd398defe79854591afa15d64eec938" exitCode=0
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.571328 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn" event={"ID":"db16e75b-2bca-4ba2-a169-146ceb4ab23d","Type":"ContainerDied","Data":"d5c31d8204fbcc96edeb84f0f339f93b7bd398defe79854591afa15d64eec938"}
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.571410 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn" event={"ID":"db16e75b-2bca-4ba2-a169-146ceb4ab23d","Type":"ContainerStarted","Data":"c7cb399ede7ff4559fb9110dcfe3943e07272c0f6149b1c48493d85abe3881a0"}
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.576243 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67dc88fb49-f9s7n" event={"ID":"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3","Type":"ContainerStarted","Data":"da32b1813adbf953923f98ec50815b367083a3884d4405627ee737ccbf076a03"}
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.585700 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94bd49868-nj59v" event={"ID":"7e75ef50-1c0b-498e-8448-39a7c8912f96","Type":"ContainerStarted","Data":"f6cbc30e46a0924e0dc742ff7bb9f702d15b8ccd9fe4cf6cff5c479718b4a71f"}
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.586675 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-ff4lh" podStartSLOduration=3.612446972 podStartE2EDuration="43.586655191s" podCreationTimestamp="2026-03-10 16:08:36 +0000 UTC" firstStartedPulling="2026-03-10 16:08:38.334692594 +0000 UTC m=+1215.456558271" lastFinishedPulling="2026-03-10 16:09:18.308900803 +0000 UTC m=+1255.430766490" observedRunningTime="2026-03-10 16:09:19.578230859 +0000 UTC m=+1256.700096546" watchObservedRunningTime="2026-03-10 16:09:19.586655191 +0000 UTC m=+1256.708520868"
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.599049 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79cc84f6-m4lp5" event={"ID":"e775d653-0d4c-4cb9-bed2-962b6589c5ee","Type":"ContainerStarted","Data":"619a4c74421300f1971be5a496feab096cb99d72a4bb333b9d16ae13437beebc"}
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.599122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79cc84f6-m4lp5" event={"ID":"e775d653-0d4c-4cb9-bed2-962b6589c5ee","Type":"ContainerStarted","Data":"4764a6089ad0a91f0547616b14277ed00d7574ad2d44d3b2cf723850a20f7504"}
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.632970 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b008b4c-16e9-4cff-a2e5-06e0a6936cd0" path="/var/lib/kubelet/pods/3b008b4c-16e9-4cff-a2e5-06e0a6936cd0/volumes"
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.987147 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 10 16:09:19 crc kubenswrapper[4749]: I0310 16:09:19.987268 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 16:09:20 crc kubenswrapper[4749]: I0310 16:09:20.624018 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79cc84f6-m4lp5" event={"ID":"e775d653-0d4c-4cb9-bed2-962b6589c5ee","Type":"ContainerStarted","Data":"a88e72babc8dda57d1d190cf18a2aec4785b2e997c29edc6c39bbc521fee4644"}
Mar 10 16:09:20 crc kubenswrapper[4749]: I0310 16:09:20.624078 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:20 crc kubenswrapper[4749]: I0310 16:09:20.624093 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79cc84f6-m4lp5"
Mar 10 16:09:20 crc kubenswrapper[4749]: I0310 16:09:20.635402 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn" event={"ID":"db16e75b-2bca-4ba2-a169-146ceb4ab23d","Type":"ContainerStarted","Data":"93cfecc776f3603dcd53cbbdc192667e2b1b8e3afd6f288f476d3df78fe4e6a4"}
Mar 10 16:09:20 crc kubenswrapper[4749]: I0310 16:09:20.635892 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"
Mar 10 16:09:20 crc kubenswrapper[4749]: I0310 16:09:20.659749 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-79cc84f6-m4lp5" podStartSLOduration=3.659724068 podStartE2EDuration="3.659724068s" podCreationTimestamp="2026-03-10 16:09:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:20.646816682 +0000 UTC m=+1257.768682369" watchObservedRunningTime="2026-03-10 16:09:20.659724068 +0000 UTC m=+1257.781589755"
Mar 10 16:09:20 crc kubenswrapper[4749]: I0310 16:09:20.665196 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 10 16:09:20 crc kubenswrapper[4749]: I0310 16:09:20.676008 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn" podStartSLOduration=3.675985376 podStartE2EDuration="3.675985376s" podCreationTimestamp="2026-03-10 16:09:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:20.672718366 +0000 UTC m=+1257.794584073" watchObservedRunningTime="2026-03-10 16:09:20.675985376 +0000 UTC m=+1257.797851063"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.279892 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6f7b864884-n5l5z"]
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.302097 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.311617 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.311716 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.359568 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f7b864884-n5l5z"]
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.465743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-config-data\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.465832 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-public-tls-certs\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.466136 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-logs\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.466181 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px8j6\" (UniqueName: \"kubernetes.io/projected/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-kube-api-access-px8j6\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.466209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-combined-ca-bundle\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.466252 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-config-data-custom\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.466296 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-internal-tls-certs\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.571163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px8j6\" (UniqueName: \"kubernetes.io/projected/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-kube-api-access-px8j6\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.571224 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-combined-ca-bundle\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.571253 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-config-data-custom\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.571299 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-internal-tls-certs\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.571332 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-config-data\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.571418 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-public-tls-certs\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.571471 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-logs\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.571858 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-logs\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.579445 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-config-data-custom\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.584341 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-combined-ca-bundle\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.592478 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-public-tls-certs\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z"
Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.594868 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-internal-tls-certs\") pod
\"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z" Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.595151 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-config-data\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z" Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.598687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px8j6\" (UniqueName: \"kubernetes.io/projected/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-kube-api-access-px8j6\") pod \"barbican-api-6f7b864884-n5l5z\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " pod="openstack/barbican-api-6f7b864884-n5l5z" Mar 10 16:09:21 crc kubenswrapper[4749]: I0310 16:09:21.733244 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f7b864884-n5l5z" Mar 10 16:09:23 crc kubenswrapper[4749]: I0310 16:09:23.052878 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f7b864884-n5l5z"] Mar 10 16:09:23 crc kubenswrapper[4749]: I0310 16:09:23.690183 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67dc88fb49-f9s7n" event={"ID":"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3","Type":"ContainerStarted","Data":"eb58db301501949b729719960a2218e0ade2ee5b0f920eebc51295d80357ecf6"} Mar 10 16:09:23 crc kubenswrapper[4749]: I0310 16:09:23.691882 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67dc88fb49-f9s7n" event={"ID":"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3","Type":"ContainerStarted","Data":"7ba23112caa9623ef0f1361b165f1a5f82eb24b9e458cf8a3e1047489301781f"} Mar 10 16:09:23 crc kubenswrapper[4749]: I0310 16:09:23.693364 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94bd49868-nj59v" event={"ID":"7e75ef50-1c0b-498e-8448-39a7c8912f96","Type":"ContainerStarted","Data":"9439fbc59b1f401d9aeefa7b3aad4d384a86e1422ff6cf26455e69cb43403ea8"} Mar 10 16:09:23 crc kubenswrapper[4749]: I0310 16:09:23.693414 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94bd49868-nj59v" event={"ID":"7e75ef50-1c0b-498e-8448-39a7c8912f96","Type":"ContainerStarted","Data":"9ab5efab7e6f316d75eb83fd5392310a8b177a64854f2535c92533beb31eecdb"} Mar 10 16:09:23 crc kubenswrapper[4749]: I0310 16:09:23.716581 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-67dc88fb49-f9s7n" podStartSLOduration=2.797719139 podStartE2EDuration="6.716558203s" podCreationTimestamp="2026-03-10 16:09:17 +0000 UTC" firstStartedPulling="2026-03-10 16:09:18.651997559 +0000 UTC m=+1255.773863246" lastFinishedPulling="2026-03-10 16:09:22.570836603 +0000 UTC m=+1259.692702310" 
observedRunningTime="2026-03-10 16:09:23.709414916 +0000 UTC m=+1260.831280593" watchObservedRunningTime="2026-03-10 16:09:23.716558203 +0000 UTC m=+1260.838423890" Mar 10 16:09:23 crc kubenswrapper[4749]: I0310 16:09:23.747300 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-94bd49868-nj59v" podStartSLOduration=3.009118455 podStartE2EDuration="6.747281599s" podCreationTimestamp="2026-03-10 16:09:17 +0000 UTC" firstStartedPulling="2026-03-10 16:09:18.833136102 +0000 UTC m=+1255.955001779" lastFinishedPulling="2026-03-10 16:09:22.571299246 +0000 UTC m=+1259.693164923" observedRunningTime="2026-03-10 16:09:23.730340493 +0000 UTC m=+1260.852206200" watchObservedRunningTime="2026-03-10 16:09:23.747281599 +0000 UTC m=+1260.869147286" Mar 10 16:09:28 crc kubenswrapper[4749]: I0310 16:09:28.277428 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn" Mar 10 16:09:28 crc kubenswrapper[4749]: I0310 16:09:28.342066 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5864dc4585-tnqz6"] Mar 10 16:09:28 crc kubenswrapper[4749]: I0310 16:09:28.342353 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" podUID="0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" containerName="dnsmasq-dns" containerID="cri-o://844567794d4661f2fc86e99dd154bbb8b52e8301aacdda4f107bc80fcdb4d8b0" gracePeriod=10 Mar 10 16:09:28 crc kubenswrapper[4749]: I0310 16:09:28.757432 4749 generic.go:334] "Generic (PLEG): container finished" podID="0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" containerID="844567794d4661f2fc86e99dd154bbb8b52e8301aacdda4f107bc80fcdb4d8b0" exitCode=0 Mar 10 16:09:28 crc kubenswrapper[4749]: I0310 16:09:28.757428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" 
event={"ID":"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf","Type":"ContainerDied","Data":"844567794d4661f2fc86e99dd154bbb8b52e8301aacdda4f107bc80fcdb4d8b0"} Mar 10 16:09:29 crc kubenswrapper[4749]: I0310 16:09:29.755058 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" podUID="0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: connect: connection refused" Mar 10 16:09:29 crc kubenswrapper[4749]: I0310 16:09:29.813034 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79cc84f6-m4lp5" Mar 10 16:09:29 crc kubenswrapper[4749]: I0310 16:09:29.890592 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79cc84f6-m4lp5" Mar 10 16:09:30 crc kubenswrapper[4749]: W0310 16:09:30.500470 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd8a90f3_a6d3_428e_a049_78cb36e2ed34.slice/crio-1f59993e0f183c523897becf9b929953b64a82be498862ee5a179b1f59b4563d WatchSource:0}: Error finding container 1f59993e0f183c523897becf9b929953b64a82be498862ee5a179b1f59b4563d: Status 404 returned error can't find the container with id 1f59993e0f183c523897becf9b929953b64a82be498862ee5a179b1f59b4563d Mar 10 16:09:30 crc kubenswrapper[4749]: I0310 16:09:30.803482 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f7b864884-n5l5z" event={"ID":"fd8a90f3-a6d3-428e-a049-78cb36e2ed34","Type":"ContainerStarted","Data":"1f59993e0f183c523897becf9b929953b64a82be498862ee5a179b1f59b4563d"} Mar 10 16:09:30 crc kubenswrapper[4749]: I0310 16:09:30.924223 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:30.999836 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-dns-svc\") pod \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.000275 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5nxq\" (UniqueName: \"kubernetes.io/projected/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-kube-api-access-j5nxq\") pod \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.000343 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-ovsdbserver-sb\") pod \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.000445 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-dns-swift-storage-0\") pod \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.000482 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-config\") pod \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.000504 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-ovsdbserver-nb\") pod \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\" (UID: \"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf\") " Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.008556 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-kube-api-access-j5nxq" (OuterVolumeSpecName: "kube-api-access-j5nxq") pod "0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" (UID: "0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf"). InnerVolumeSpecName "kube-api-access-j5nxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.008765 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5nxq\" (UniqueName: \"kubernetes.io/projected/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-kube-api-access-j5nxq\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.057654 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" (UID: "0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.058179 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" (UID: "0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.064139 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" (UID: "0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.072232 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-config" (OuterVolumeSpecName: "config") pod "0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" (UID: "0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.073713 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" (UID: "0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.110451 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.110486 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.110499 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.110508 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.110517 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:31 crc kubenswrapper[4749]: E0310 16:09:31.751028 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f2376b2_8792_4f6e_ba0e_d5cfea25dfdf.slice/crio-1cd7e4ec28667d6a965395522a1c8413b47d90fe7c88adb6ccc025da63908169\": RecentStats: unable to find data in memory cache]" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.809702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3fea9311-0e47-4352-8d4f-ac90db816fc1","Type":"ContainerStarted","Data":"0faa0d56e15958b323118c196375eb00d682d0a86018f28491d5a9fbeca5a040"} Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.809852 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="proxy-httpd" containerID="cri-o://0faa0d56e15958b323118c196375eb00d682d0a86018f28491d5a9fbeca5a040" gracePeriod=30 Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.809872 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.809843 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="ceilometer-central-agent" containerID="cri-o://ed18f1b882ca0e625ea1d9e295570112cd67ff4379d310dfc0b3721c89d6483d" gracePeriod=30 Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.809902 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="sg-core" containerID="cri-o://69726bc538fb4d8696a56b555259e84c8f0c2ad08bcb562273e4b98326de76d5" gracePeriod=30 Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.809902 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="ceilometer-notification-agent" containerID="cri-o://aa94feb75acb96142e193e0eeaf17ee8d9f0bbf8d774f64014bb8cc27c9204cd" gracePeriod=30 Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.813625 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" 
event={"ID":"0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf","Type":"ContainerDied","Data":"1cd7e4ec28667d6a965395522a1c8413b47d90fe7c88adb6ccc025da63908169"} Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.813667 4749 scope.go:117] "RemoveContainer" containerID="844567794d4661f2fc86e99dd154bbb8b52e8301aacdda4f107bc80fcdb4d8b0" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.813788 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5864dc4585-tnqz6" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.819770 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f7b864884-n5l5z" event={"ID":"fd8a90f3-a6d3-428e-a049-78cb36e2ed34","Type":"ContainerStarted","Data":"a5a9d70f8163b90e1531e0cff378ba14e371d6d2dacaf35b433db16489b0e4f9"} Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.819808 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f7b864884-n5l5z" event={"ID":"fd8a90f3-a6d3-428e-a049-78cb36e2ed34","Type":"ContainerStarted","Data":"8d6ca1fb9f59946d8771109218993126572d8059bdcaaf95b7b5954d7fd097ee"} Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.819993 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f7b864884-n5l5z" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.820059 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f7b864884-n5l5z" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.840888 4749 scope.go:117] "RemoveContainer" containerID="0f457c92d0e63c17ba27eb4cb2bcf081fc93033f435b7a6f34fadb128d1be857" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.845300 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.004519133 podStartE2EDuration="54.845275553s" podCreationTimestamp="2026-03-10 16:08:37 +0000 UTC" 
firstStartedPulling="2026-03-10 16:08:38.848335805 +0000 UTC m=+1215.970201492" lastFinishedPulling="2026-03-10 16:09:30.689092215 +0000 UTC m=+1267.810957912" observedRunningTime="2026-03-10 16:09:31.835959986 +0000 UTC m=+1268.957825683" watchObservedRunningTime="2026-03-10 16:09:31.845275553 +0000 UTC m=+1268.967141260" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.857726 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f7b864884-n5l5z" podStartSLOduration=10.857706375 podStartE2EDuration="10.857706375s" podCreationTimestamp="2026-03-10 16:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:31.856557864 +0000 UTC m=+1268.978423551" watchObservedRunningTime="2026-03-10 16:09:31.857706375 +0000 UTC m=+1268.979572062" Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.882063 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5864dc4585-tnqz6"] Mar 10 16:09:31 crc kubenswrapper[4749]: I0310 16:09:31.889740 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5864dc4585-tnqz6"] Mar 10 16:09:32 crc kubenswrapper[4749]: I0310 16:09:32.843601 4749 generic.go:334] "Generic (PLEG): container finished" podID="876272e9-3af8-40ba-aac7-40f8cecc909e" containerID="1dc1d42c78b2ad1937fd19367d842bf5ad46d49552fa234892aba8a8ed5f0cf3" exitCode=0 Mar 10 16:09:32 crc kubenswrapper[4749]: I0310 16:09:32.843694 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ff4lh" event={"ID":"876272e9-3af8-40ba-aac7-40f8cecc909e","Type":"ContainerDied","Data":"1dc1d42c78b2ad1937fd19367d842bf5ad46d49552fa234892aba8a8ed5f0cf3"} Mar 10 16:09:32 crc kubenswrapper[4749]: I0310 16:09:32.851608 4749 generic.go:334] "Generic (PLEG): container finished" podID="3fea9311-0e47-4352-8d4f-ac90db816fc1" 
containerID="0faa0d56e15958b323118c196375eb00d682d0a86018f28491d5a9fbeca5a040" exitCode=0 Mar 10 16:09:32 crc kubenswrapper[4749]: I0310 16:09:32.851634 4749 generic.go:334] "Generic (PLEG): container finished" podID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerID="69726bc538fb4d8696a56b555259e84c8f0c2ad08bcb562273e4b98326de76d5" exitCode=2 Mar 10 16:09:32 crc kubenswrapper[4749]: I0310 16:09:32.851642 4749 generic.go:334] "Generic (PLEG): container finished" podID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerID="aa94feb75acb96142e193e0eeaf17ee8d9f0bbf8d774f64014bb8cc27c9204cd" exitCode=0 Mar 10 16:09:32 crc kubenswrapper[4749]: I0310 16:09:32.851649 4749 generic.go:334] "Generic (PLEG): container finished" podID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerID="ed18f1b882ca0e625ea1d9e295570112cd67ff4379d310dfc0b3721c89d6483d" exitCode=0 Mar 10 16:09:32 crc kubenswrapper[4749]: I0310 16:09:32.851733 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fea9311-0e47-4352-8d4f-ac90db816fc1","Type":"ContainerDied","Data":"0faa0d56e15958b323118c196375eb00d682d0a86018f28491d5a9fbeca5a040"} Mar 10 16:09:32 crc kubenswrapper[4749]: I0310 16:09:32.851753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fea9311-0e47-4352-8d4f-ac90db816fc1","Type":"ContainerDied","Data":"69726bc538fb4d8696a56b555259e84c8f0c2ad08bcb562273e4b98326de76d5"} Mar 10 16:09:32 crc kubenswrapper[4749]: I0310 16:09:32.851803 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fea9311-0e47-4352-8d4f-ac90db816fc1","Type":"ContainerDied","Data":"aa94feb75acb96142e193e0eeaf17ee8d9f0bbf8d774f64014bb8cc27c9204cd"} Mar 10 16:09:32 crc kubenswrapper[4749]: I0310 16:09:32.851814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3fea9311-0e47-4352-8d4f-ac90db816fc1","Type":"ContainerDied","Data":"ed18f1b882ca0e625ea1d9e295570112cd67ff4379d310dfc0b3721c89d6483d"} Mar 10 16:09:32 crc kubenswrapper[4749]: I0310 16:09:32.995931 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.148006 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-config-data\") pod \"3fea9311-0e47-4352-8d4f-ac90db816fc1\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.148339 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-sg-core-conf-yaml\") pod \"3fea9311-0e47-4352-8d4f-ac90db816fc1\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.148810 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-combined-ca-bundle\") pod \"3fea9311-0e47-4352-8d4f-ac90db816fc1\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.149060 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fea9311-0e47-4352-8d4f-ac90db816fc1-run-httpd\") pod \"3fea9311-0e47-4352-8d4f-ac90db816fc1\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.149252 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-scripts\") pod 
\"3fea9311-0e47-4352-8d4f-ac90db816fc1\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.149465 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6w4s\" (UniqueName: \"kubernetes.io/projected/3fea9311-0e47-4352-8d4f-ac90db816fc1-kube-api-access-m6w4s\") pod \"3fea9311-0e47-4352-8d4f-ac90db816fc1\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.149695 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fea9311-0e47-4352-8d4f-ac90db816fc1-log-httpd\") pod \"3fea9311-0e47-4352-8d4f-ac90db816fc1\" (UID: \"3fea9311-0e47-4352-8d4f-ac90db816fc1\") " Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.149546 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fea9311-0e47-4352-8d4f-ac90db816fc1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3fea9311-0e47-4352-8d4f-ac90db816fc1" (UID: "3fea9311-0e47-4352-8d4f-ac90db816fc1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.150439 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fea9311-0e47-4352-8d4f-ac90db816fc1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3fea9311-0e47-4352-8d4f-ac90db816fc1" (UID: "3fea9311-0e47-4352-8d4f-ac90db816fc1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.150693 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fea9311-0e47-4352-8d4f-ac90db816fc1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.154276 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fea9311-0e47-4352-8d4f-ac90db816fc1-kube-api-access-m6w4s" (OuterVolumeSpecName: "kube-api-access-m6w4s") pod "3fea9311-0e47-4352-8d4f-ac90db816fc1" (UID: "3fea9311-0e47-4352-8d4f-ac90db816fc1"). InnerVolumeSpecName "kube-api-access-m6w4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.154442 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-scripts" (OuterVolumeSpecName: "scripts") pod "3fea9311-0e47-4352-8d4f-ac90db816fc1" (UID: "3fea9311-0e47-4352-8d4f-ac90db816fc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.174913 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3fea9311-0e47-4352-8d4f-ac90db816fc1" (UID: "3fea9311-0e47-4352-8d4f-ac90db816fc1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.221609 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fea9311-0e47-4352-8d4f-ac90db816fc1" (UID: "3fea9311-0e47-4352-8d4f-ac90db816fc1"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.252159 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fea9311-0e47-4352-8d4f-ac90db816fc1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.252194 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.252208 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.252223 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.252234 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6w4s\" (UniqueName: \"kubernetes.io/projected/3fea9311-0e47-4352-8d4f-ac90db816fc1-kube-api-access-m6w4s\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.261202 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-config-data" (OuterVolumeSpecName: "config-data") pod "3fea9311-0e47-4352-8d4f-ac90db816fc1" (UID: "3fea9311-0e47-4352-8d4f-ac90db816fc1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.354483 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fea9311-0e47-4352-8d4f-ac90db816fc1-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.631301 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" path="/var/lib/kubelet/pods/0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf/volumes" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.863257 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.863328 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fea9311-0e47-4352-8d4f-ac90db816fc1","Type":"ContainerDied","Data":"3960ea3070dcbafeb98063c9c59c53d1c83b53838e9d56884ff6de370ae06aaa"} Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.863364 4749 scope.go:117] "RemoveContainer" containerID="0faa0d56e15958b323118c196375eb00d682d0a86018f28491d5a9fbeca5a040" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.888124 4749 scope.go:117] "RemoveContainer" containerID="69726bc538fb4d8696a56b555259e84c8f0c2ad08bcb562273e4b98326de76d5" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.891124 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.901542 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.918712 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:09:33 crc kubenswrapper[4749]: E0310 16:09:33.919087 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="ceilometer-central-agent" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.919104 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="ceilometer-central-agent" Mar 10 16:09:33 crc kubenswrapper[4749]: E0310 16:09:33.919121 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="proxy-httpd" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.919128 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="proxy-httpd" Mar 10 16:09:33 crc kubenswrapper[4749]: E0310 16:09:33.919136 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" containerName="init" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.919141 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" containerName="init" Mar 10 16:09:33 crc kubenswrapper[4749]: E0310 16:09:33.919159 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="ceilometer-notification-agent" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.919165 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="ceilometer-notification-agent" Mar 10 16:09:33 crc kubenswrapper[4749]: E0310 16:09:33.919180 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="sg-core" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.919185 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="sg-core" Mar 10 16:09:33 crc kubenswrapper[4749]: E0310 16:09:33.919203 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" containerName="dnsmasq-dns" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.919209 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" containerName="dnsmasq-dns" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.919387 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f2376b2-8792-4f6e-ba0e-d5cfea25dfdf" containerName="dnsmasq-dns" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.919402 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="proxy-httpd" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.919409 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="ceilometer-central-agent" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.919418 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="ceilometer-notification-agent" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.919428 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" containerName="sg-core" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.920979 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.923303 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.923409 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.929659 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.930239 4749 scope.go:117] "RemoveContainer" containerID="aa94feb75acb96142e193e0eeaf17ee8d9f0bbf8d774f64014bb8cc27c9204cd" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.964956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.965022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9wzs\" (UniqueName: \"kubernetes.io/projected/57e7b33d-a14d-42aa-838d-158a2a4229a1-kube-api-access-k9wzs\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.965088 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e7b33d-a14d-42aa-838d-158a2a4229a1-log-httpd\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.965174 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-config-data\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.965206 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.965272 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e7b33d-a14d-42aa-838d-158a2a4229a1-run-httpd\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.965299 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-scripts\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:33 crc kubenswrapper[4749]: I0310 16:09:33.976039 4749 scope.go:117] "RemoveContainer" containerID="ed18f1b882ca0e625ea1d9e295570112cd67ff4379d310dfc0b3721c89d6483d" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.067523 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-config-data\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.067596 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.067651 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e7b33d-a14d-42aa-838d-158a2a4229a1-run-httpd\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.067694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-scripts\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.067769 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.067820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9wzs\" (UniqueName: \"kubernetes.io/projected/57e7b33d-a14d-42aa-838d-158a2a4229a1-kube-api-access-k9wzs\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.067901 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e7b33d-a14d-42aa-838d-158a2a4229a1-log-httpd\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:34 
crc kubenswrapper[4749]: I0310 16:09:34.069608 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e7b33d-a14d-42aa-838d-158a2a4229a1-log-httpd\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.071033 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e7b33d-a14d-42aa-838d-158a2a4229a1-run-httpd\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.074913 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.075427 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-scripts\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.075684 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.078732 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-config-data\") pod \"ceilometer-0\" (UID: 
\"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.091531 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9wzs\" (UniqueName: \"kubernetes.io/projected/57e7b33d-a14d-42aa-838d-158a2a4229a1-kube-api-access-k9wzs\") pod \"ceilometer-0\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " pod="openstack/ceilometer-0" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.242016 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.261300 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ff4lh" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.371624 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-combined-ca-bundle\") pod \"876272e9-3af8-40ba-aac7-40f8cecc909e\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.372043 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/876272e9-3af8-40ba-aac7-40f8cecc909e-etc-machine-id\") pod \"876272e9-3af8-40ba-aac7-40f8cecc909e\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.372128 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-scripts\") pod \"876272e9-3af8-40ba-aac7-40f8cecc909e\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.372221 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-db-sync-config-data\") pod \"876272e9-3af8-40ba-aac7-40f8cecc909e\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.372256 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-config-data\") pod \"876272e9-3af8-40ba-aac7-40f8cecc909e\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.372295 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sg2w\" (UniqueName: \"kubernetes.io/projected/876272e9-3af8-40ba-aac7-40f8cecc909e-kube-api-access-8sg2w\") pod \"876272e9-3af8-40ba-aac7-40f8cecc909e\" (UID: \"876272e9-3af8-40ba-aac7-40f8cecc909e\") " Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.373360 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/876272e9-3af8-40ba-aac7-40f8cecc909e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "876272e9-3af8-40ba-aac7-40f8cecc909e" (UID: "876272e9-3af8-40ba-aac7-40f8cecc909e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.379100 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-scripts" (OuterVolumeSpecName: "scripts") pod "876272e9-3af8-40ba-aac7-40f8cecc909e" (UID: "876272e9-3af8-40ba-aac7-40f8cecc909e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.379157 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/876272e9-3af8-40ba-aac7-40f8cecc909e-kube-api-access-8sg2w" (OuterVolumeSpecName: "kube-api-access-8sg2w") pod "876272e9-3af8-40ba-aac7-40f8cecc909e" (UID: "876272e9-3af8-40ba-aac7-40f8cecc909e"). InnerVolumeSpecName "kube-api-access-8sg2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.379952 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "876272e9-3af8-40ba-aac7-40f8cecc909e" (UID: "876272e9-3af8-40ba-aac7-40f8cecc909e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.405635 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "876272e9-3af8-40ba-aac7-40f8cecc909e" (UID: "876272e9-3af8-40ba-aac7-40f8cecc909e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.426048 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-config-data" (OuterVolumeSpecName: "config-data") pod "876272e9-3af8-40ba-aac7-40f8cecc909e" (UID: "876272e9-3af8-40ba-aac7-40f8cecc909e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.474153 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.474186 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/876272e9-3af8-40ba-aac7-40f8cecc909e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.474195 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.474206 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.474215 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/876272e9-3af8-40ba-aac7-40f8cecc909e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.474223 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sg2w\" (UniqueName: \"kubernetes.io/projected/876272e9-3af8-40ba-aac7-40f8cecc909e-kube-api-access-8sg2w\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:34 crc kubenswrapper[4749]: W0310 16:09:34.687785 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57e7b33d_a14d_42aa_838d_158a2a4229a1.slice/crio-ca4fb8a70014bb4a1a1fa25709752200781a17a73e76b8bcec7ff643540cbdbd WatchSource:0}: Error 
finding container ca4fb8a70014bb4a1a1fa25709752200781a17a73e76b8bcec7ff643540cbdbd: Status 404 returned error can't find the container with id ca4fb8a70014bb4a1a1fa25709752200781a17a73e76b8bcec7ff643540cbdbd Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.695332 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.844088 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.877921 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e7b33d-a14d-42aa-838d-158a2a4229a1","Type":"ContainerStarted","Data":"ca4fb8a70014bb4a1a1fa25709752200781a17a73e76b8bcec7ff643540cbdbd"} Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.881308 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ff4lh" event={"ID":"876272e9-3af8-40ba-aac7-40f8cecc909e","Type":"ContainerDied","Data":"7365c2c026559dbb7d4c5393b5a1c9b63c97be4bb76f06a335d3080cfa536716"} Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.881802 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7365c2c026559dbb7d4c5393b5a1c9b63c97be4bb76f06a335d3080cfa536716" Mar 10 16:09:34 crc kubenswrapper[4749]: I0310 16:09:34.881821 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-ff4lh" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.161907 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 16:09:35 crc kubenswrapper[4749]: E0310 16:09:35.162269 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="876272e9-3af8-40ba-aac7-40f8cecc909e" containerName="cinder-db-sync" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.162280 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="876272e9-3af8-40ba-aac7-40f8cecc909e" containerName="cinder-db-sync" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.162457 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="876272e9-3af8-40ba-aac7-40f8cecc909e" containerName="cinder-db-sync" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.163280 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.176735 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qfn9s" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.176997 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.177148 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.177232 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.203584 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.251110 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d4bc56d77-cjnl8"] Mar 10 16:09:35 crc 
kubenswrapper[4749]: I0310 16:09:35.251404 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d4bc56d77-cjnl8" podUID="0cf0df71-68ed-41f5-8bee-8f2ec91f133f" containerName="neutron-api" containerID="cri-o://6384072e7eda3838a0f4a2261ffc6756be1dc38df934506afb80c689b299737c" gracePeriod=30 Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.252182 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d4bc56d77-cjnl8" podUID="0cf0df71-68ed-41f5-8bee-8f2ec91f133f" containerName="neutron-httpd" containerID="cri-o://52861deaa6b8da0398f6ad619b2142325877fc700973aeb0570d3800fe2acd5b" gracePeriod=30 Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.275804 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-d4bc56d77-cjnl8" podUID="0cf0df71-68ed-41f5-8bee-8f2ec91f133f" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": EOF" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.301685 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86dc97b969-c77bn"] Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.304049 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.361475 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86dc97b969-c77bn"] Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.381412 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-658447d949-bwfgt"] Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.383061 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.395268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.395327 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-scripts\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.395358 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c12cece-3521-4500-83eb-451ca55c6443-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.395414 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-ovsdbserver-nb\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.395435 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-dns-svc\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " 
pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.395450 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-dns-swift-storage-0\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.395513 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-config\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.395541 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-config-data\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.395558 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd5x2\" (UniqueName: \"kubernetes.io/projected/85834161-43ab-465b-bb71-811ed69c132b-kube-api-access-bd5x2\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.395580 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.395602 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w8ds\" (UniqueName: \"kubernetes.io/projected/2c12cece-3521-4500-83eb-451ca55c6443-kube-api-access-8w8ds\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.395622 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-ovsdbserver-sb\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.402443 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-658447d949-bwfgt"] Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.432926 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.435396 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.442324 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.464935 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.501920 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.501965 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-scripts\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.501993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c12cece-3521-4500-83eb-451ca55c6443-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502008 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-ovsdbserver-nb\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502033 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-public-tls-certs\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502050 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-dns-svc\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502070 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-dns-swift-storage-0\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502089 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhf8j\" (UniqueName: \"kubernetes.io/projected/236aa9f6-5238-45de-813d-e0b18c887f64-kube-api-access-dhf8j\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502115 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-config\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502127 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-httpd-config\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502147 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-ovndb-tls-certs\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502175 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-internal-tls-certs\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502192 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-config\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502217 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-config-data\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502232 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd5x2\" (UniqueName: 
\"kubernetes.io/projected/85834161-43ab-465b-bb71-811ed69c132b-kube-api-access-bd5x2\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502253 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-combined-ca-bundle\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502270 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502290 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w8ds\" (UniqueName: \"kubernetes.io/projected/2c12cece-3521-4500-83eb-451ca55c6443-kube-api-access-8w8ds\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.502309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-ovsdbserver-sb\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.510668 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-scripts\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.510749 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c12cece-3521-4500-83eb-451ca55c6443-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.511543 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-ovsdbserver-nb\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.512233 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-dns-swift-storage-0\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.512775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-dns-svc\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.513291 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-config\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: 
\"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.513970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-ovsdbserver-sb\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.538992 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.550975 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w8ds\" (UniqueName: \"kubernetes.io/projected/2c12cece-3521-4500-83eb-451ca55c6443-kube-api-access-8w8ds\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.554083 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd5x2\" (UniqueName: \"kubernetes.io/projected/85834161-43ab-465b-bb71-811ed69c132b-kube-api-access-bd5x2\") pod \"dnsmasq-dns-86dc97b969-c77bn\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.566577 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-config-data\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: 
I0310 16:09:35.567134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.578957 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.609935 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-internal-tls-certs\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.609988 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrxjt\" (UniqueName: \"kubernetes.io/projected/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-kube-api-access-xrxjt\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.610021 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.610042 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-scripts\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " 
pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.610071 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-logs\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.610089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-combined-ca-bundle\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.610117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.610178 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-config-data-custom\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.610202 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-config-data\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.610232 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-public-tls-certs\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.610270 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhf8j\" (UniqueName: \"kubernetes.io/projected/236aa9f6-5238-45de-813d-e0b18c887f64-kube-api-access-dhf8j\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.610294 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-config\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.610308 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-httpd-config\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.610328 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-ovndb-tls-certs\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.613959 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-public-tls-certs\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.620207 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-ovndb-tls-certs\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.621001 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-internal-tls-certs\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.621978 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-httpd-config\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.622321 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-config\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.636042 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-combined-ca-bundle\") pod \"neutron-658447d949-bwfgt\" (UID: 
\"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.639122 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fea9311-0e47-4352-8d4f-ac90db816fc1" path="/var/lib/kubelet/pods/3fea9311-0e47-4352-8d4f-ac90db816fc1/volumes" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.656083 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhf8j\" (UniqueName: \"kubernetes.io/projected/236aa9f6-5238-45de-813d-e0b18c887f64-kube-api-access-dhf8j\") pod \"neutron-658447d949-bwfgt\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.676938 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.727913 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.728208 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrxjt\" (UniqueName: \"kubernetes.io/projected/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-kube-api-access-xrxjt\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.728580 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.728610 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-scripts\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.728673 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-logs\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.728722 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.728866 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-config-data-custom\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.728901 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-config-data\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.730476 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.730844 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-logs\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.740895 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-scripts\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.745080 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-config-data-custom\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.745529 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.745530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-config-data\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.752743 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrxjt\" (UniqueName: \"kubernetes.io/projected/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-kube-api-access-xrxjt\") pod \"cinder-api-0\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.767799 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 16:09:35 crc kubenswrapper[4749]: I0310 16:09:35.902121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e7b33d-a14d-42aa-838d-158a2a4229a1","Type":"ContainerStarted","Data":"f843da5b161c55109383cd92b6b7fbfcfaa272d107fef445d38434f8e575b4b2"} Mar 10 16:09:36 crc kubenswrapper[4749]: I0310 16:09:36.131334 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 16:09:36 crc kubenswrapper[4749]: I0310 16:09:36.261193 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86dc97b969-c77bn"] Mar 10 16:09:36 crc kubenswrapper[4749]: I0310 16:09:36.381573 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 16:09:36 crc kubenswrapper[4749]: I0310 16:09:36.481546 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-658447d949-bwfgt"] Mar 10 16:09:36 crc kubenswrapper[4749]: W0310 16:09:36.499036 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod236aa9f6_5238_45de_813d_e0b18c887f64.slice/crio-479b4915e2f2b9054cc92ad9b324f5c20b4c5f07585a3ff0f7aa610a89ff5dbb WatchSource:0}: Error finding container 479b4915e2f2b9054cc92ad9b324f5c20b4c5f07585a3ff0f7aa610a89ff5dbb: Status 404 returned error can't find the container with id 479b4915e2f2b9054cc92ad9b324f5c20b4c5f07585a3ff0f7aa610a89ff5dbb Mar 10 16:09:36 crc kubenswrapper[4749]: I0310 16:09:36.975584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c12cece-3521-4500-83eb-451ca55c6443","Type":"ContainerStarted","Data":"374335d5ba2630262cef11a3ca643ca89912b9b971f0f92401ea576de13aecbd"} Mar 10 16:09:37 crc kubenswrapper[4749]: I0310 16:09:37.017270 4749 generic.go:334] "Generic (PLEG): container finished" podID="0cf0df71-68ed-41f5-8bee-8f2ec91f133f" 
containerID="52861deaa6b8da0398f6ad619b2142325877fc700973aeb0570d3800fe2acd5b" exitCode=0 Mar 10 16:09:37 crc kubenswrapper[4749]: I0310 16:09:37.017335 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d4bc56d77-cjnl8" event={"ID":"0cf0df71-68ed-41f5-8bee-8f2ec91f133f","Type":"ContainerDied","Data":"52861deaa6b8da0398f6ad619b2142325877fc700973aeb0570d3800fe2acd5b"} Mar 10 16:09:37 crc kubenswrapper[4749]: I0310 16:09:37.043965 4749 generic.go:334] "Generic (PLEG): container finished" podID="85834161-43ab-465b-bb71-811ed69c132b" containerID="892aaf5f6d48acc7761f831589c515f4a6698ac0529e229f1a6febfd5df4373f" exitCode=0 Mar 10 16:09:37 crc kubenswrapper[4749]: I0310 16:09:37.044047 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dc97b969-c77bn" event={"ID":"85834161-43ab-465b-bb71-811ed69c132b","Type":"ContainerDied","Data":"892aaf5f6d48acc7761f831589c515f4a6698ac0529e229f1a6febfd5df4373f"} Mar 10 16:09:37 crc kubenswrapper[4749]: I0310 16:09:37.044079 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dc97b969-c77bn" event={"ID":"85834161-43ab-465b-bb71-811ed69c132b","Type":"ContainerStarted","Data":"b3a4311e697c5e0b6418409fe3abfc8d0559e829862b4ecfd946afd7ead56ba6"} Mar 10 16:09:37 crc kubenswrapper[4749]: I0310 16:09:37.092102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e7b33d-a14d-42aa-838d-158a2a4229a1","Type":"ContainerStarted","Data":"893c341f28a27b00f8cc7d20795aa37f2e98ddba787b335a0965cbc6e649705a"} Mar 10 16:09:37 crc kubenswrapper[4749]: I0310 16:09:37.107774 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-658447d949-bwfgt" event={"ID":"236aa9f6-5238-45de-813d-e0b18c887f64","Type":"ContainerStarted","Data":"1ef238a15dbaf6a5a3c10ac3efb6186c6bcbe89603f578e61204e28180d61d73"} Mar 10 16:09:37 crc kubenswrapper[4749]: I0310 16:09:37.107836 4749 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/neutron-658447d949-bwfgt" event={"ID":"236aa9f6-5238-45de-813d-e0b18c887f64","Type":"ContainerStarted","Data":"479b4915e2f2b9054cc92ad9b324f5c20b4c5f07585a3ff0f7aa610a89ff5dbb"} Mar 10 16:09:37 crc kubenswrapper[4749]: I0310 16:09:37.125016 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457","Type":"ContainerStarted","Data":"11c7eb8ce1a9a1b3ece0b621aea0dfb52a54dc575a99fb6f7a8b2dd564a87c76"} Mar 10 16:09:37 crc kubenswrapper[4749]: I0310 16:09:37.581501 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-d4bc56d77-cjnl8" podUID="0cf0df71-68ed-41f5-8bee-8f2ec91f133f" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": dial tcp 10.217.0.157:9696: connect: connection refused" Mar 10 16:09:38 crc kubenswrapper[4749]: I0310 16:09:38.180678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457","Type":"ContainerStarted","Data":"8d1628e7975b8bb76ee4da7993ee15bc71c162521802e176a648350c30a7977c"} Mar 10 16:09:38 crc kubenswrapper[4749]: I0310 16:09:38.204821 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c12cece-3521-4500-83eb-451ca55c6443","Type":"ContainerStarted","Data":"3be29bc06bd62d1df71e7532448edba35ae1e450f2c14289d1227be502bf255f"} Mar 10 16:09:38 crc kubenswrapper[4749]: I0310 16:09:38.238688 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dc97b969-c77bn" event={"ID":"85834161-43ab-465b-bb71-811ed69c132b","Type":"ContainerStarted","Data":"8914d90010a8494bbf6ad9346d0b25d6a194b8e682a59f9b058c540a1a071030"} Mar 10 16:09:38 crc kubenswrapper[4749]: I0310 16:09:38.239129 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:38 crc kubenswrapper[4749]: 
I0310 16:09:38.247677 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e7b33d-a14d-42aa-838d-158a2a4229a1","Type":"ContainerStarted","Data":"c9cb354e91aa7005db421f2582735683c64a50e2227e2ef383f40d218a041cfc"} Mar 10 16:09:38 crc kubenswrapper[4749]: I0310 16:09:38.267593 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-658447d949-bwfgt" event={"ID":"236aa9f6-5238-45de-813d-e0b18c887f64","Type":"ContainerStarted","Data":"44cd16ebefec8b032bd832bb6a1686dd3509d76e1b509d8e93cab3ba9ee33de6"} Mar 10 16:09:38 crc kubenswrapper[4749]: I0310 16:09:38.268392 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:09:38 crc kubenswrapper[4749]: I0310 16:09:38.282357 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86dc97b969-c77bn" podStartSLOduration=3.2823405660000002 podStartE2EDuration="3.282340566s" podCreationTimestamp="2026-03-10 16:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:38.279699243 +0000 UTC m=+1275.401564930" watchObservedRunningTime="2026-03-10 16:09:38.282340566 +0000 UTC m=+1275.404206253" Mar 10 16:09:38 crc kubenswrapper[4749]: I0310 16:09:38.329586 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-658447d949-bwfgt" podStartSLOduration=3.329563118 podStartE2EDuration="3.329563118s" podCreationTimestamp="2026-03-10 16:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:38.315509861 +0000 UTC m=+1275.437375538" watchObservedRunningTime="2026-03-10 16:09:38.329563118 +0000 UTC m=+1275.451428795" Mar 10 16:09:38 crc kubenswrapper[4749]: I0310 16:09:38.563971 4749 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/cinder-api-0"] Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.320488 4749 generic.go:334] "Generic (PLEG): container finished" podID="0cf0df71-68ed-41f5-8bee-8f2ec91f133f" containerID="6384072e7eda3838a0f4a2261ffc6756be1dc38df934506afb80c689b299737c" exitCode=0 Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.321138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d4bc56d77-cjnl8" event={"ID":"0cf0df71-68ed-41f5-8bee-8f2ec91f133f","Type":"ContainerDied","Data":"6384072e7eda3838a0f4a2261ffc6756be1dc38df934506afb80c689b299737c"} Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.321187 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d4bc56d77-cjnl8" event={"ID":"0cf0df71-68ed-41f5-8bee-8f2ec91f133f","Type":"ContainerDied","Data":"3df2cecd3588f3c17b90c9f6602510401b4bb02cada25c275a9d4d0b0c1f9595"} Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.321198 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3df2cecd3588f3c17b90c9f6602510401b4bb02cada25c275a9d4d0b0c1f9595" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.330555 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457","Type":"ContainerStarted","Data":"cdf7cf95dc67b719905cccdfab13f2f24d50799b44e383b9e6f4d7700d9a9bc3"} Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.330788 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" containerName="cinder-api-log" containerID="cri-o://8d1628e7975b8bb76ee4da7993ee15bc71c162521802e176a648350c30a7977c" gracePeriod=30 Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.331026 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" 
containerName="cinder-api" containerID="cri-o://cdf7cf95dc67b719905cccdfab13f2f24d50799b44e383b9e6f4d7700d9a9bc3" gracePeriod=30 Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.331207 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.349456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c12cece-3521-4500-83eb-451ca55c6443","Type":"ContainerStarted","Data":"50134020b3b66e6f9766a1f6645379f6c4e994e3f3b677bafe82fac9cb3ec48a"} Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.362779 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.365020 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.365009287 podStartE2EDuration="4.365009287s" podCreationTimestamp="2026-03-10 16:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:39.363792034 +0000 UTC m=+1276.485657721" watchObservedRunningTime="2026-03-10 16:09:39.365009287 +0000 UTC m=+1276.486874974" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.397883 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.583464506 podStartE2EDuration="4.397866973s" podCreationTimestamp="2026-03-10 16:09:35 +0000 UTC" firstStartedPulling="2026-03-10 16:09:36.139777901 +0000 UTC m=+1273.261643588" lastFinishedPulling="2026-03-10 16:09:36.954180368 +0000 UTC m=+1274.076046055" observedRunningTime="2026-03-10 16:09:39.389079061 +0000 UTC m=+1276.510944748" watchObservedRunningTime="2026-03-10 16:09:39.397866973 +0000 UTC m=+1276.519732660" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 
16:09:39.462035 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-combined-ca-bundle\") pod \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.462118 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-config\") pod \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.462174 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-httpd-config\") pod \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.462211 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4b6m\" (UniqueName: \"kubernetes.io/projected/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-kube-api-access-b4b6m\") pod \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.462239 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-ovndb-tls-certs\") pod \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.462255 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-public-tls-certs\") pod 
\"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.462272 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-internal-tls-certs\") pod \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\" (UID: \"0cf0df71-68ed-41f5-8bee-8f2ec91f133f\") " Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.485947 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0cf0df71-68ed-41f5-8bee-8f2ec91f133f" (UID: "0cf0df71-68ed-41f5-8bee-8f2ec91f133f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.510743 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-kube-api-access-b4b6m" (OuterVolumeSpecName: "kube-api-access-b4b6m") pod "0cf0df71-68ed-41f5-8bee-8f2ec91f133f" (UID: "0cf0df71-68ed-41f5-8bee-8f2ec91f133f"). InnerVolumeSpecName "kube-api-access-b4b6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.564318 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.564359 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4b6m\" (UniqueName: \"kubernetes.io/projected/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-kube-api-access-b4b6m\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.657710 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0cf0df71-68ed-41f5-8bee-8f2ec91f133f" (UID: "0cf0df71-68ed-41f5-8bee-8f2ec91f133f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.663345 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-config" (OuterVolumeSpecName: "config") pod "0cf0df71-68ed-41f5-8bee-8f2ec91f133f" (UID: "0cf0df71-68ed-41f5-8bee-8f2ec91f133f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.665533 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.665561 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.676717 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cf0df71-68ed-41f5-8bee-8f2ec91f133f" (UID: "0cf0df71-68ed-41f5-8bee-8f2ec91f133f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.676906 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0cf0df71-68ed-41f5-8bee-8f2ec91f133f" (UID: "0cf0df71-68ed-41f5-8bee-8f2ec91f133f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.743427 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0cf0df71-68ed-41f5-8bee-8f2ec91f133f" (UID: "0cf0df71-68ed-41f5-8bee-8f2ec91f133f"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.766859 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.766887 4749 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:39 crc kubenswrapper[4749]: I0310 16:09:39.766896 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf0df71-68ed-41f5-8bee-8f2ec91f133f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.032716 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f7b864884-n5l5z" Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.358762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e7b33d-a14d-42aa-838d-158a2a4229a1","Type":"ContainerStarted","Data":"9aeff0174de3e22e208a8c6418d5fbd0bb88c6f7a4980e4e32345580b8482cd6"} Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.360272 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.362785 4749 generic.go:334] "Generic (PLEG): container finished" podID="0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" containerID="cdf7cf95dc67b719905cccdfab13f2f24d50799b44e383b9e6f4d7700d9a9bc3" exitCode=0 Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.362817 4749 generic.go:334] "Generic (PLEG): container finished" podID="0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" 
containerID="8d1628e7975b8bb76ee4da7993ee15bc71c162521802e176a648350c30a7977c" exitCode=143 Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.362906 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d4bc56d77-cjnl8" Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.367771 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457","Type":"ContainerDied","Data":"cdf7cf95dc67b719905cccdfab13f2f24d50799b44e383b9e6f4d7700d9a9bc3"} Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.367816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457","Type":"ContainerDied","Data":"8d1628e7975b8bb76ee4da7993ee15bc71c162521802e176a648350c30a7977c"} Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.367862 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.396174 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.088827453 podStartE2EDuration="7.396151118s" podCreationTimestamp="2026-03-10 16:09:33 +0000 UTC" firstStartedPulling="2026-03-10 16:09:34.689881077 +0000 UTC m=+1271.811746764" lastFinishedPulling="2026-03-10 16:09:39.997204742 +0000 UTC m=+1277.119070429" observedRunningTime="2026-03-10 16:09:40.387463349 +0000 UTC m=+1277.509329036" watchObservedRunningTime="2026-03-10 16:09:40.396151118 +0000 UTC m=+1277.518016805" Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.457575 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d4bc56d77-cjnl8"] Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.464529 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d4bc56d77-cjnl8"] Mar 10 16:09:40 crc 
kubenswrapper[4749]: I0310 16:09:40.472641 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f7b864884-n5l5z" Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.583050 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-79cc84f6-m4lp5"] Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.583268 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-79cc84f6-m4lp5" podUID="e775d653-0d4c-4cb9-bed2-962b6589c5ee" containerName="barbican-api-log" containerID="cri-o://619a4c74421300f1971be5a496feab096cb99d72a4bb333b9d16ae13437beebc" gracePeriod=30 Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.583675 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-79cc84f6-m4lp5" podUID="e775d653-0d4c-4cb9-bed2-962b6589c5ee" containerName="barbican-api" containerID="cri-o://a88e72babc8dda57d1d190cf18a2aec4785b2e997c29edc6c39bbc521fee4644" gracePeriod=30 Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.585617 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 16:09:40 crc kubenswrapper[4749]: I0310 16:09:40.944575 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.014531 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-logs\") pod \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.014617 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-scripts\") pod \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.014699 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrxjt\" (UniqueName: \"kubernetes.io/projected/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-kube-api-access-xrxjt\") pod \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.014731 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-combined-ca-bundle\") pod \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.014825 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-config-data\") pod \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.014871 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-config-data-custom\") pod \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.014900 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-etc-machine-id\") pod \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\" (UID: \"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457\") " Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.015275 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" (UID: "0e6ab430-bd5f-4f8a-96d6-2ee0b5507457"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.015801 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-logs" (OuterVolumeSpecName: "logs") pod "0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" (UID: "0e6ab430-bd5f-4f8a-96d6-2ee0b5507457"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.026522 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-scripts" (OuterVolumeSpecName: "scripts") pod "0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" (UID: "0e6ab430-bd5f-4f8a-96d6-2ee0b5507457"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.030549 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-kube-api-access-xrxjt" (OuterVolumeSpecName: "kube-api-access-xrxjt") pod "0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" (UID: "0e6ab430-bd5f-4f8a-96d6-2ee0b5507457"). InnerVolumeSpecName "kube-api-access-xrxjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.055699 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" (UID: "0e6ab430-bd5f-4f8a-96d6-2ee0b5507457"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.086328 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" (UID: "0e6ab430-bd5f-4f8a-96d6-2ee0b5507457"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.116860 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrxjt\" (UniqueName: \"kubernetes.io/projected/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-kube-api-access-xrxjt\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.116909 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.116922 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.116933 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.116948 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.116960 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.118399 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-config-data" (OuterVolumeSpecName: "config-data") pod "0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" (UID: "0e6ab430-bd5f-4f8a-96d6-2ee0b5507457"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.218922 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.376105 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0e6ab430-bd5f-4f8a-96d6-2ee0b5507457","Type":"ContainerDied","Data":"11c7eb8ce1a9a1b3ece0b621aea0dfb52a54dc575a99fb6f7a8b2dd564a87c76"} Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.376216 4749 scope.go:117] "RemoveContainer" containerID="cdf7cf95dc67b719905cccdfab13f2f24d50799b44e383b9e6f4d7700d9a9bc3" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.376140 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.378596 4749 generic.go:334] "Generic (PLEG): container finished" podID="e775d653-0d4c-4cb9-bed2-962b6589c5ee" containerID="619a4c74421300f1971be5a496feab096cb99d72a4bb333b9d16ae13437beebc" exitCode=143 Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.379841 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79cc84f6-m4lp5" event={"ID":"e775d653-0d4c-4cb9-bed2-962b6589c5ee","Type":"ContainerDied","Data":"619a4c74421300f1971be5a496feab096cb99d72a4bb333b9d16ae13437beebc"} Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.408546 4749 scope.go:117] "RemoveContainer" containerID="8d1628e7975b8bb76ee4da7993ee15bc71c162521802e176a648350c30a7977c" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.453459 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.482698 4749 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/cinder-api-0"] Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.506640 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 16:09:41 crc kubenswrapper[4749]: E0310 16:09:41.507334 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cf0df71-68ed-41f5-8bee-8f2ec91f133f" containerName="neutron-api" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.507363 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf0df71-68ed-41f5-8bee-8f2ec91f133f" containerName="neutron-api" Mar 10 16:09:41 crc kubenswrapper[4749]: E0310 16:09:41.507402 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" containerName="cinder-api-log" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.507413 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" containerName="cinder-api-log" Mar 10 16:09:41 crc kubenswrapper[4749]: E0310 16:09:41.507438 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" containerName="cinder-api" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.507450 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" containerName="cinder-api" Mar 10 16:09:41 crc kubenswrapper[4749]: E0310 16:09:41.509050 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cf0df71-68ed-41f5-8bee-8f2ec91f133f" containerName="neutron-httpd" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.509112 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf0df71-68ed-41f5-8bee-8f2ec91f133f" containerName="neutron-httpd" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.509521 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cf0df71-68ed-41f5-8bee-8f2ec91f133f" containerName="neutron-api" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 
16:09:41.509543 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cf0df71-68ed-41f5-8bee-8f2ec91f133f" containerName="neutron-httpd" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.509554 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" containerName="cinder-api-log" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.509572 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" containerName="cinder-api" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.510822 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.521369 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.525832 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.525886 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.525968 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.571061 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.628570 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cf0df71-68ed-41f5-8bee-8f2ec91f133f" path="/var/lib/kubelet/pods/0cf0df71-68ed-41f5-8bee-8f2ec91f133f/volumes" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.629448 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e6ab430-bd5f-4f8a-96d6-2ee0b5507457" 
path="/var/lib/kubelet/pods/0e6ab430-bd5f-4f8a-96d6-2ee0b5507457/volumes" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.632803 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d845ea-a98a-43ae-9803-30e5d306d29d-logs\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.633011 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-scripts\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.633083 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-config-data-custom\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.633146 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-config-data\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.633187 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2ppp\" (UniqueName: \"kubernetes.io/projected/a0d845ea-a98a-43ae-9803-30e5d306d29d-kube-api-access-j2ppp\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.633275 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.633836 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0d845ea-a98a-43ae-9803-30e5d306d29d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.633890 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.633936 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.735546 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.735629 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/a0d845ea-a98a-43ae-9803-30e5d306d29d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.735652 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.735674 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.735700 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d845ea-a98a-43ae-9803-30e5d306d29d-logs\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.735772 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-scripts\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.735803 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-config-data-custom\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 
16:09:41.735820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-config-data\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.735837 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2ppp\" (UniqueName: \"kubernetes.io/projected/a0d845ea-a98a-43ae-9803-30e5d306d29d-kube-api-access-j2ppp\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.737451 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d845ea-a98a-43ae-9803-30e5d306d29d-logs\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.738637 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0d845ea-a98a-43ae-9803-30e5d306d29d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.758079 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-scripts\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.758315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.760160 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.760754 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.766239 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-config-data-custom\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.771365 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-config-data\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.786044 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2ppp\" (UniqueName: \"kubernetes.io/projected/a0d845ea-a98a-43ae-9803-30e5d306d29d-kube-api-access-j2ppp\") pod \"cinder-api-0\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.834935 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.873478 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8dfcffcf6-962bk"] Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.874855 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.899687 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8dfcffcf6-962bk"] Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.940478 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-combined-ca-bundle\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.940560 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg9sc\" (UniqueName: \"kubernetes.io/projected/7cc64163-530a-4b31-9acc-84910336b781-kube-api-access-tg9sc\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.940610 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-scripts\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.940641 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7cc64163-530a-4b31-9acc-84910336b781-logs\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.940658 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-public-tls-certs\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.940682 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-config-data\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:41 crc kubenswrapper[4749]: I0310 16:09:41.940715 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-internal-tls-certs\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.044578 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-scripts\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.044678 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7cc64163-530a-4b31-9acc-84910336b781-logs\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.044708 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-public-tls-certs\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.044750 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-config-data\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.044785 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-internal-tls-certs\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.044905 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-combined-ca-bundle\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.044971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg9sc\" (UniqueName: 
\"kubernetes.io/projected/7cc64163-530a-4b31-9acc-84910336b781-kube-api-access-tg9sc\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.053533 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cc64163-530a-4b31-9acc-84910336b781-logs\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.059727 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-combined-ca-bundle\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.067268 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-scripts\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.067607 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-internal-tls-certs\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.068280 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-config-data\") pod \"placement-8dfcffcf6-962bk\" (UID: 
\"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.068476 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-public-tls-certs\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.086324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg9sc\" (UniqueName: \"kubernetes.io/projected/7cc64163-530a-4b31-9acc-84910336b781-kube-api-access-tg9sc\") pod \"placement-8dfcffcf6-962bk\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.258421 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.260669 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.440913 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0d845ea-a98a-43ae-9803-30e5d306d29d","Type":"ContainerStarted","Data":"c097dd8012cac30d05e75fe83efdca1aaa8a554789942b137dbade44d1339656"} Mar 10 16:09:42 crc kubenswrapper[4749]: I0310 16:09:42.780223 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8dfcffcf6-962bk"] Mar 10 16:09:42 crc kubenswrapper[4749]: W0310 16:09:42.795107 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cc64163_530a_4b31_9acc_84910336b781.slice/crio-b37ba40d7206c0af8fcc7ac5985e11349cbc974a49612fd2dc99c32c5aa1f203 WatchSource:0}: Error finding container b37ba40d7206c0af8fcc7ac5985e11349cbc974a49612fd2dc99c32c5aa1f203: Status 404 returned error can't find the container with id b37ba40d7206c0af8fcc7ac5985e11349cbc974a49612fd2dc99c32c5aa1f203 Mar 10 16:09:43 crc kubenswrapper[4749]: I0310 16:09:43.468455 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0d845ea-a98a-43ae-9803-30e5d306d29d","Type":"ContainerStarted","Data":"0b98dca184d3190b9e30a3a5bcef1c368b98ad5b0683d689cbed67e3d6d6fa5c"} Mar 10 16:09:43 crc kubenswrapper[4749]: I0310 16:09:43.470630 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8dfcffcf6-962bk" event={"ID":"7cc64163-530a-4b31-9acc-84910336b781","Type":"ContainerStarted","Data":"57585bb9b412ce5a752b8acb87716ffcfdbab4a41883a915954d36af9a0479b4"} Mar 10 16:09:43 crc kubenswrapper[4749]: I0310 16:09:43.470829 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8dfcffcf6-962bk" event={"ID":"7cc64163-530a-4b31-9acc-84910336b781","Type":"ContainerStarted","Data":"e35a1f1f5541a5e016192c72f9089e80dd9f3fd2c9d8da246bcb6d412f4bd4c2"} Mar 10 
16:09:43 crc kubenswrapper[4749]: I0310 16:09:43.470840 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8dfcffcf6-962bk" event={"ID":"7cc64163-530a-4b31-9acc-84910336b781","Type":"ContainerStarted","Data":"b37ba40d7206c0af8fcc7ac5985e11349cbc974a49612fd2dc99c32c5aa1f203"} Mar 10 16:09:43 crc kubenswrapper[4749]: I0310 16:09:43.472195 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:43 crc kubenswrapper[4749]: I0310 16:09:43.472222 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:09:43 crc kubenswrapper[4749]: I0310 16:09:43.505046 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8dfcffcf6-962bk" podStartSLOduration=2.505005037 podStartE2EDuration="2.505005037s" podCreationTimestamp="2026-03-10 16:09:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:43.491567566 +0000 UTC m=+1280.613433253" watchObservedRunningTime="2026-03-10 16:09:43.505005037 +0000 UTC m=+1280.626870724" Mar 10 16:09:43 crc kubenswrapper[4749]: I0310 16:09:43.743626 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79cc84f6-m4lp5" podUID="e775d653-0d4c-4cb9-bed2-962b6589c5ee" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:46054->10.217.0.163:9311: read: connection reset by peer" Mar 10 16:09:43 crc kubenswrapper[4749]: I0310 16:09:43.743643 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79cc84f6-m4lp5" podUID="e775d653-0d4c-4cb9-bed2-962b6589c5ee" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:46052->10.217.0.163:9311: read: connection 
reset by peer" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.222106 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79cc84f6-m4lp5" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.306268 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp9pt\" (UniqueName: \"kubernetes.io/projected/e775d653-0d4c-4cb9-bed2-962b6589c5ee-kube-api-access-sp9pt\") pod \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.306348 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-config-data-custom\") pod \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.306641 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-combined-ca-bundle\") pod \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.306689 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-config-data\") pod \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.306716 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e775d653-0d4c-4cb9-bed2-962b6589c5ee-logs\") pod \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\" (UID: \"e775d653-0d4c-4cb9-bed2-962b6589c5ee\") " Mar 10 16:09:44 crc 
kubenswrapper[4749]: I0310 16:09:44.307573 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e775d653-0d4c-4cb9-bed2-962b6589c5ee-logs" (OuterVolumeSpecName: "logs") pod "e775d653-0d4c-4cb9-bed2-962b6589c5ee" (UID: "e775d653-0d4c-4cb9-bed2-962b6589c5ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.328562 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e775d653-0d4c-4cb9-bed2-962b6589c5ee" (UID: "e775d653-0d4c-4cb9-bed2-962b6589c5ee"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.328968 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e775d653-0d4c-4cb9-bed2-962b6589c5ee-kube-api-access-sp9pt" (OuterVolumeSpecName: "kube-api-access-sp9pt") pod "e775d653-0d4c-4cb9-bed2-962b6589c5ee" (UID: "e775d653-0d4c-4cb9-bed2-962b6589c5ee"). InnerVolumeSpecName "kube-api-access-sp9pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.337489 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e775d653-0d4c-4cb9-bed2-962b6589c5ee" (UID: "e775d653-0d4c-4cb9-bed2-962b6589c5ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.357109 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-config-data" (OuterVolumeSpecName: "config-data") pod "e775d653-0d4c-4cb9-bed2-962b6589c5ee" (UID: "e775d653-0d4c-4cb9-bed2-962b6589c5ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.409231 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp9pt\" (UniqueName: \"kubernetes.io/projected/e775d653-0d4c-4cb9-bed2-962b6589c5ee-kube-api-access-sp9pt\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.409286 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.409300 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.409313 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e775d653-0d4c-4cb9-bed2-962b6589c5ee-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.409327 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e775d653-0d4c-4cb9-bed2-962b6589c5ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.480280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"a0d845ea-a98a-43ae-9803-30e5d306d29d","Type":"ContainerStarted","Data":"f8d45398159e5ac21928a6fe846afd80a890bf4e12f47c045bfc44f153806dda"} Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.481600 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.483660 4749 generic.go:334] "Generic (PLEG): container finished" podID="e775d653-0d4c-4cb9-bed2-962b6589c5ee" containerID="a88e72babc8dda57d1d190cf18a2aec4785b2e997c29edc6c39bbc521fee4644" exitCode=0 Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.484405 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79cc84f6-m4lp5" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.485434 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79cc84f6-m4lp5" event={"ID":"e775d653-0d4c-4cb9-bed2-962b6589c5ee","Type":"ContainerDied","Data":"a88e72babc8dda57d1d190cf18a2aec4785b2e997c29edc6c39bbc521fee4644"} Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.485468 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79cc84f6-m4lp5" event={"ID":"e775d653-0d4c-4cb9-bed2-962b6589c5ee","Type":"ContainerDied","Data":"4764a6089ad0a91f0547616b14277ed00d7574ad2d44d3b2cf723850a20f7504"} Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.485486 4749 scope.go:117] "RemoveContainer" containerID="a88e72babc8dda57d1d190cf18a2aec4785b2e997c29edc6c39bbc521fee4644" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.517107 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.517080892 podStartE2EDuration="3.517080892s" podCreationTimestamp="2026-03-10 16:09:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:44.500844935 
+0000 UTC m=+1281.622710622" watchObservedRunningTime="2026-03-10 16:09:44.517080892 +0000 UTC m=+1281.638946579" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.528882 4749 scope.go:117] "RemoveContainer" containerID="619a4c74421300f1971be5a496feab096cb99d72a4bb333b9d16ae13437beebc" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.536186 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-79cc84f6-m4lp5"] Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.547003 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-79cc84f6-m4lp5"] Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.566996 4749 scope.go:117] "RemoveContainer" containerID="a88e72babc8dda57d1d190cf18a2aec4785b2e997c29edc6c39bbc521fee4644" Mar 10 16:09:44 crc kubenswrapper[4749]: E0310 16:09:44.567625 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a88e72babc8dda57d1d190cf18a2aec4785b2e997c29edc6c39bbc521fee4644\": container with ID starting with a88e72babc8dda57d1d190cf18a2aec4785b2e997c29edc6c39bbc521fee4644 not found: ID does not exist" containerID="a88e72babc8dda57d1d190cf18a2aec4785b2e997c29edc6c39bbc521fee4644" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.567748 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a88e72babc8dda57d1d190cf18a2aec4785b2e997c29edc6c39bbc521fee4644"} err="failed to get container status \"a88e72babc8dda57d1d190cf18a2aec4785b2e997c29edc6c39bbc521fee4644\": rpc error: code = NotFound desc = could not find container \"a88e72babc8dda57d1d190cf18a2aec4785b2e997c29edc6c39bbc521fee4644\": container with ID starting with a88e72babc8dda57d1d190cf18a2aec4785b2e997c29edc6c39bbc521fee4644 not found: ID does not exist" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.567823 4749 scope.go:117] "RemoveContainer" 
containerID="619a4c74421300f1971be5a496feab096cb99d72a4bb333b9d16ae13437beebc" Mar 10 16:09:44 crc kubenswrapper[4749]: E0310 16:09:44.568313 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"619a4c74421300f1971be5a496feab096cb99d72a4bb333b9d16ae13437beebc\": container with ID starting with 619a4c74421300f1971be5a496feab096cb99d72a4bb333b9d16ae13437beebc not found: ID does not exist" containerID="619a4c74421300f1971be5a496feab096cb99d72a4bb333b9d16ae13437beebc" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.568347 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"619a4c74421300f1971be5a496feab096cb99d72a4bb333b9d16ae13437beebc"} err="failed to get container status \"619a4c74421300f1971be5a496feab096cb99d72a4bb333b9d16ae13437beebc\": rpc error: code = NotFound desc = could not find container \"619a4c74421300f1971be5a496feab096cb99d72a4bb333b9d16ae13437beebc\": container with ID starting with 619a4c74421300f1971be5a496feab096cb99d72a4bb333b9d16ae13437beebc not found: ID does not exist" Mar 10 16:09:44 crc kubenswrapper[4749]: I0310 16:09:44.611791 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:09:45 crc kubenswrapper[4749]: I0310 16:09:45.619025 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e775d653-0d4c-4cb9-bed2-962b6589c5ee" path="/var/lib/kubelet/pods/e775d653-0d4c-4cb9-bed2-962b6589c5ee/volumes" Mar 10 16:09:45 crc kubenswrapper[4749]: I0310 16:09:45.678611 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:09:45 crc kubenswrapper[4749]: I0310 16:09:45.758799 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"] Mar 10 16:09:45 crc kubenswrapper[4749]: I0310 16:09:45.759233 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn" podUID="db16e75b-2bca-4ba2-a169-146ceb4ab23d" containerName="dnsmasq-dns" containerID="cri-o://93cfecc776f3603dcd53cbbdc192667e2b1b8e3afd6f288f476d3df78fe4e6a4" gracePeriod=10 Mar 10 16:09:45 crc kubenswrapper[4749]: I0310 16:09:45.842359 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 10 16:09:45 crc kubenswrapper[4749]: I0310 16:09:45.903821 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.383457 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.473135 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-dns-svc\") pod \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.473393 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-ovsdbserver-sb\") pod \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.473545 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-dns-swift-storage-0\") pod \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.473642 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-hflch\" (UniqueName: \"kubernetes.io/projected/db16e75b-2bca-4ba2-a169-146ceb4ab23d-kube-api-access-hflch\") pod \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.473681 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-config\") pod \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.473727 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-ovsdbserver-nb\") pod \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\" (UID: \"db16e75b-2bca-4ba2-a169-146ceb4ab23d\") " Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.515055 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db16e75b-2bca-4ba2-a169-146ceb4ab23d-kube-api-access-hflch" (OuterVolumeSpecName: "kube-api-access-hflch") pod "db16e75b-2bca-4ba2-a169-146ceb4ab23d" (UID: "db16e75b-2bca-4ba2-a169-146ceb4ab23d"). InnerVolumeSpecName "kube-api-access-hflch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.533403 4749 generic.go:334] "Generic (PLEG): container finished" podID="db16e75b-2bca-4ba2-a169-146ceb4ab23d" containerID="93cfecc776f3603dcd53cbbdc192667e2b1b8e3afd6f288f476d3df78fe4e6a4" exitCode=0 Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.533628 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2c12cece-3521-4500-83eb-451ca55c6443" containerName="cinder-scheduler" containerID="cri-o://3be29bc06bd62d1df71e7532448edba35ae1e450f2c14289d1227be502bf255f" gracePeriod=30 Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.533931 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.534979 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn" event={"ID":"db16e75b-2bca-4ba2-a169-146ceb4ab23d","Type":"ContainerDied","Data":"93cfecc776f3603dcd53cbbdc192667e2b1b8e3afd6f288f476d3df78fe4e6a4"} Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.535015 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b78c5c5d5-dx4gn" event={"ID":"db16e75b-2bca-4ba2-a169-146ceb4ab23d","Type":"ContainerDied","Data":"c7cb399ede7ff4559fb9110dcfe3943e07272c0f6149b1c48493d85abe3881a0"} Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.535035 4749 scope.go:117] "RemoveContainer" containerID="93cfecc776f3603dcd53cbbdc192667e2b1b8e3afd6f288f476d3df78fe4e6a4" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.536444 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2c12cece-3521-4500-83eb-451ca55c6443" containerName="probe" containerID="cri-o://50134020b3b66e6f9766a1f6645379f6c4e994e3f3b677bafe82fac9cb3ec48a" 
gracePeriod=30 Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.549388 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "db16e75b-2bca-4ba2-a169-146ceb4ab23d" (UID: "db16e75b-2bca-4ba2-a169-146ceb4ab23d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.552691 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "db16e75b-2bca-4ba2-a169-146ceb4ab23d" (UID: "db16e75b-2bca-4ba2-a169-146ceb4ab23d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.576595 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.576626 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hflch\" (UniqueName: \"kubernetes.io/projected/db16e75b-2bca-4ba2-a169-146ceb4ab23d-kube-api-access-hflch\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.576640 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.588248 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod 
"db16e75b-2bca-4ba2-a169-146ceb4ab23d" (UID: "db16e75b-2bca-4ba2-a169-146ceb4ab23d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.599900 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "db16e75b-2bca-4ba2-a169-146ceb4ab23d" (UID: "db16e75b-2bca-4ba2-a169-146ceb4ab23d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.602218 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-config" (OuterVolumeSpecName: "config") pod "db16e75b-2bca-4ba2-a169-146ceb4ab23d" (UID: "db16e75b-2bca-4ba2-a169-146ceb4ab23d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.646855 4749 scope.go:117] "RemoveContainer" containerID="d5c31d8204fbcc96edeb84f0f339f93b7bd398defe79854591afa15d64eec938" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.677199 4749 scope.go:117] "RemoveContainer" containerID="93cfecc776f3603dcd53cbbdc192667e2b1b8e3afd6f288f476d3df78fe4e6a4" Mar 10 16:09:46 crc kubenswrapper[4749]: E0310 16:09:46.678034 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93cfecc776f3603dcd53cbbdc192667e2b1b8e3afd6f288f476d3df78fe4e6a4\": container with ID starting with 93cfecc776f3603dcd53cbbdc192667e2b1b8e3afd6f288f476d3df78fe4e6a4 not found: ID does not exist" containerID="93cfecc776f3603dcd53cbbdc192667e2b1b8e3afd6f288f476d3df78fe4e6a4" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.678077 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"93cfecc776f3603dcd53cbbdc192667e2b1b8e3afd6f288f476d3df78fe4e6a4"} err="failed to get container status \"93cfecc776f3603dcd53cbbdc192667e2b1b8e3afd6f288f476d3df78fe4e6a4\": rpc error: code = NotFound desc = could not find container \"93cfecc776f3603dcd53cbbdc192667e2b1b8e3afd6f288f476d3df78fe4e6a4\": container with ID starting with 93cfecc776f3603dcd53cbbdc192667e2b1b8e3afd6f288f476d3df78fe4e6a4 not found: ID does not exist" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.678101 4749 scope.go:117] "RemoveContainer" containerID="d5c31d8204fbcc96edeb84f0f339f93b7bd398defe79854591afa15d64eec938" Mar 10 16:09:46 crc kubenswrapper[4749]: E0310 16:09:46.678332 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c31d8204fbcc96edeb84f0f339f93b7bd398defe79854591afa15d64eec938\": container with ID starting with d5c31d8204fbcc96edeb84f0f339f93b7bd398defe79854591afa15d64eec938 not found: ID does not exist" containerID="d5c31d8204fbcc96edeb84f0f339f93b7bd398defe79854591afa15d64eec938" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.678361 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c31d8204fbcc96edeb84f0f339f93b7bd398defe79854591afa15d64eec938"} err="failed to get container status \"d5c31d8204fbcc96edeb84f0f339f93b7bd398defe79854591afa15d64eec938\": rpc error: code = NotFound desc = could not find container \"d5c31d8204fbcc96edeb84f0f339f93b7bd398defe79854591afa15d64eec938\": container with ID starting with d5c31d8204fbcc96edeb84f0f339f93b7bd398defe79854591afa15d64eec938 not found: ID does not exist" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.680435 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 
16:09:46.680459 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.680471 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db16e75b-2bca-4ba2-a169-146ceb4ab23d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.880578 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"] Mar 10 16:09:46 crc kubenswrapper[4749]: I0310 16:09:46.890727 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b78c5c5d5-dx4gn"] Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.023036 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 16:09:47 crc kubenswrapper[4749]: E0310 16:09:47.023359 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db16e75b-2bca-4ba2-a169-146ceb4ab23d" containerName="dnsmasq-dns" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.023391 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="db16e75b-2bca-4ba2-a169-146ceb4ab23d" containerName="dnsmasq-dns" Mar 10 16:09:47 crc kubenswrapper[4749]: E0310 16:09:47.023410 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e775d653-0d4c-4cb9-bed2-962b6589c5ee" containerName="barbican-api-log" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.023416 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e775d653-0d4c-4cb9-bed2-962b6589c5ee" containerName="barbican-api-log" Mar 10 16:09:47 crc kubenswrapper[4749]: E0310 16:09:47.023442 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e775d653-0d4c-4cb9-bed2-962b6589c5ee" containerName="barbican-api" Mar 10 16:09:47 crc kubenswrapper[4749]: 
I0310 16:09:47.023448 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e775d653-0d4c-4cb9-bed2-962b6589c5ee" containerName="barbican-api" Mar 10 16:09:47 crc kubenswrapper[4749]: E0310 16:09:47.023458 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db16e75b-2bca-4ba2-a169-146ceb4ab23d" containerName="init" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.023464 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="db16e75b-2bca-4ba2-a169-146ceb4ab23d" containerName="init" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.023625 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="db16e75b-2bca-4ba2-a169-146ceb4ab23d" containerName="dnsmasq-dns" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.023650 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e775d653-0d4c-4cb9-bed2-962b6589c5ee" containerName="barbican-api-log" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.023662 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e775d653-0d4c-4cb9-bed2-962b6589c5ee" containerName="barbican-api" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.024317 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.031176 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.031242 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-zzpzs" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.041240 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.045367 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.190583 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bdf02d6a-5794-4b1d-b155-f683bdb8680d-openstack-config\") pod \"openstackclient\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " pod="openstack/openstackclient" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.190929 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmx2k\" (UniqueName: \"kubernetes.io/projected/bdf02d6a-5794-4b1d-b155-f683bdb8680d-kube-api-access-hmx2k\") pod \"openstackclient\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " pod="openstack/openstackclient" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.191058 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf02d6a-5794-4b1d-b155-f683bdb8680d-openstack-config-secret\") pod \"openstackclient\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " pod="openstack/openstackclient" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.191168 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf02d6a-5794-4b1d-b155-f683bdb8680d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " pod="openstack/openstackclient" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.292994 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bdf02d6a-5794-4b1d-b155-f683bdb8680d-openstack-config\") pod \"openstackclient\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " pod="openstack/openstackclient" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.293297 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmx2k\" (UniqueName: \"kubernetes.io/projected/bdf02d6a-5794-4b1d-b155-f683bdb8680d-kube-api-access-hmx2k\") pod \"openstackclient\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " pod="openstack/openstackclient" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.293415 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf02d6a-5794-4b1d-b155-f683bdb8680d-openstack-config-secret\") pod \"openstackclient\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " pod="openstack/openstackclient" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.293538 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf02d6a-5794-4b1d-b155-f683bdb8680d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " pod="openstack/openstackclient" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.293897 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/bdf02d6a-5794-4b1d-b155-f683bdb8680d-openstack-config\") pod \"openstackclient\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " pod="openstack/openstackclient" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.297741 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf02d6a-5794-4b1d-b155-f683bdb8680d-openstack-config-secret\") pod \"openstackclient\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " pod="openstack/openstackclient" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.297947 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf02d6a-5794-4b1d-b155-f683bdb8680d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " pod="openstack/openstackclient" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.310140 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmx2k\" (UniqueName: \"kubernetes.io/projected/bdf02d6a-5794-4b1d-b155-f683bdb8680d-kube-api-access-hmx2k\") pod \"openstackclient\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " pod="openstack/openstackclient" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.351262 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.547239 4749 generic.go:334] "Generic (PLEG): container finished" podID="2c12cece-3521-4500-83eb-451ca55c6443" containerID="50134020b3b66e6f9766a1f6645379f6c4e994e3f3b677bafe82fac9cb3ec48a" exitCode=0 Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.547441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c12cece-3521-4500-83eb-451ca55c6443","Type":"ContainerDied","Data":"50134020b3b66e6f9766a1f6645379f6c4e994e3f3b677bafe82fac9cb3ec48a"} Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.618031 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db16e75b-2bca-4ba2-a169-146ceb4ab23d" path="/var/lib/kubelet/pods/db16e75b-2bca-4ba2-a169-146ceb4ab23d/volumes" Mar 10 16:09:47 crc kubenswrapper[4749]: I0310 16:09:47.859555 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 16:09:48 crc kubenswrapper[4749]: I0310 16:09:48.559607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"bdf02d6a-5794-4b1d-b155-f683bdb8680d","Type":"ContainerStarted","Data":"02e4e4260cd0292c4be2ba0bc26efa024a8a4b47215ad8b9de889654cec14fcf"} Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.639331 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-67dd78ff7-qfbxb"] Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.641785 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.657645 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.657808 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.662380 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.667741 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-67dd78ff7-qfbxb"] Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.759608 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-public-tls-certs\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.759683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/852c97ea-349d-4262-b36c-2ef7aa81ae75-etc-swift\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.759721 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-config-data\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.759750 
4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/852c97ea-349d-4262-b36c-2ef7aa81ae75-log-httpd\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.759782 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-combined-ca-bundle\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.759799 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6h96\" (UniqueName: \"kubernetes.io/projected/852c97ea-349d-4262-b36c-2ef7aa81ae75-kube-api-access-r6h96\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.760572 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/852c97ea-349d-4262-b36c-2ef7aa81ae75-run-httpd\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.760647 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-internal-tls-certs\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc 
kubenswrapper[4749]: I0310 16:09:50.862207 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/852c97ea-349d-4262-b36c-2ef7aa81ae75-log-httpd\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.862618 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-combined-ca-bundle\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.862642 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6h96\" (UniqueName: \"kubernetes.io/projected/852c97ea-349d-4262-b36c-2ef7aa81ae75-kube-api-access-r6h96\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.862721 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/852c97ea-349d-4262-b36c-2ef7aa81ae75-run-httpd\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.862790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-internal-tls-certs\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.862838 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-public-tls-certs\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.862909 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/852c97ea-349d-4262-b36c-2ef7aa81ae75-etc-swift\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.862945 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-config-data\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.863729 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/852c97ea-349d-4262-b36c-2ef7aa81ae75-run-httpd\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.864660 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/852c97ea-349d-4262-b36c-2ef7aa81ae75-log-httpd\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.870174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/852c97ea-349d-4262-b36c-2ef7aa81ae75-etc-swift\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.871618 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-public-tls-certs\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.872588 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-config-data\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.876639 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-combined-ca-bundle\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.877032 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-internal-tls-certs\") pod \"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.887193 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6h96\" (UniqueName: \"kubernetes.io/projected/852c97ea-349d-4262-b36c-2ef7aa81ae75-kube-api-access-r6h96\") pod 
\"swift-proxy-67dd78ff7-qfbxb\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.980863 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.980986 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:09:50 crc kubenswrapper[4749]: I0310 16:09:50.987660 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.571247 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.592435 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-67dd78ff7-qfbxb"] Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.599087 4749 generic.go:334] "Generic (PLEG): container finished" podID="2c12cece-3521-4500-83eb-451ca55c6443" containerID="3be29bc06bd62d1df71e7532448edba35ae1e450f2c14289d1227be502bf255f" exitCode=0 Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.599129 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c12cece-3521-4500-83eb-451ca55c6443","Type":"ContainerDied","Data":"3be29bc06bd62d1df71e7532448edba35ae1e450f2c14289d1227be502bf255f"} Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.599156 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c12cece-3521-4500-83eb-451ca55c6443","Type":"ContainerDied","Data":"374335d5ba2630262cef11a3ca643ca89912b9b971f0f92401ea576de13aecbd"} Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.599173 4749 scope.go:117] "RemoveContainer" containerID="50134020b3b66e6f9766a1f6645379f6c4e994e3f3b677bafe82fac9cb3ec48a" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.599295 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.692025 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-config-data\") pod \"2c12cece-3521-4500-83eb-451ca55c6443\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.692096 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-combined-ca-bundle\") pod \"2c12cece-3521-4500-83eb-451ca55c6443\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.692205 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w8ds\" (UniqueName: \"kubernetes.io/projected/2c12cece-3521-4500-83eb-451ca55c6443-kube-api-access-8w8ds\") pod \"2c12cece-3521-4500-83eb-451ca55c6443\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.692280 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-scripts\") pod \"2c12cece-3521-4500-83eb-451ca55c6443\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.692305 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-config-data-custom\") pod \"2c12cece-3521-4500-83eb-451ca55c6443\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.692357 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/2c12cece-3521-4500-83eb-451ca55c6443-etc-machine-id\") pod \"2c12cece-3521-4500-83eb-451ca55c6443\" (UID: \"2c12cece-3521-4500-83eb-451ca55c6443\") " Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.692529 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c12cece-3521-4500-83eb-451ca55c6443-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2c12cece-3521-4500-83eb-451ca55c6443" (UID: "2c12cece-3521-4500-83eb-451ca55c6443"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.694034 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c12cece-3521-4500-83eb-451ca55c6443-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.699548 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-scripts" (OuterVolumeSpecName: "scripts") pod "2c12cece-3521-4500-83eb-451ca55c6443" (UID: "2c12cece-3521-4500-83eb-451ca55c6443"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.699588 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c12cece-3521-4500-83eb-451ca55c6443-kube-api-access-8w8ds" (OuterVolumeSpecName: "kube-api-access-8w8ds") pod "2c12cece-3521-4500-83eb-451ca55c6443" (UID: "2c12cece-3521-4500-83eb-451ca55c6443"). InnerVolumeSpecName "kube-api-access-8w8ds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.700418 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2c12cece-3521-4500-83eb-451ca55c6443" (UID: "2c12cece-3521-4500-83eb-451ca55c6443"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.757961 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c12cece-3521-4500-83eb-451ca55c6443" (UID: "2c12cece-3521-4500-83eb-451ca55c6443"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.799580 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.799887 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.799908 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.799919 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w8ds\" (UniqueName: \"kubernetes.io/projected/2c12cece-3521-4500-83eb-451ca55c6443-kube-api-access-8w8ds\") 
on node \"crc\" DevicePath \"\"" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.823939 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-config-data" (OuterVolumeSpecName: "config-data") pod "2c12cece-3521-4500-83eb-451ca55c6443" (UID: "2c12cece-3521-4500-83eb-451ca55c6443"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.903552 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c12cece-3521-4500-83eb-451ca55c6443-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.967933 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 16:09:51 crc kubenswrapper[4749]: I0310 16:09:51.979951 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.005389 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 16:09:52 crc kubenswrapper[4749]: E0310 16:09:52.005964 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c12cece-3521-4500-83eb-451ca55c6443" containerName="cinder-scheduler" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.005985 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c12cece-3521-4500-83eb-451ca55c6443" containerName="cinder-scheduler" Mar 10 16:09:52 crc kubenswrapper[4749]: E0310 16:09:52.006001 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c12cece-3521-4500-83eb-451ca55c6443" containerName="probe" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.006008 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c12cece-3521-4500-83eb-451ca55c6443" containerName="probe" Mar 10 16:09:52 crc kubenswrapper[4749]: 
I0310 16:09:52.006209 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c12cece-3521-4500-83eb-451ca55c6443" containerName="cinder-scheduler" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.006231 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c12cece-3521-4500-83eb-451ca55c6443" containerName="probe" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.007560 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.013858 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.025326 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.106731 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwmjd\" (UniqueName: \"kubernetes.io/projected/01351004-ea7d-4973-9dd2-859022a35edb-kube-api-access-mwmjd\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.106799 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.106870 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-scripts\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " 
pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.106937 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/01351004-ea7d-4973-9dd2-859022a35edb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.106969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.107007 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-config-data\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.174897 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.175200 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="ceilometer-central-agent" containerID="cri-o://f843da5b161c55109383cd92b6b7fbfcfaa272d107fef445d38434f8e575b4b2" gracePeriod=30 Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.175241 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="proxy-httpd" 
containerID="cri-o://9aeff0174de3e22e208a8c6418d5fbd0bb88c6f7a4980e4e32345580b8482cd6" gracePeriod=30 Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.175396 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="sg-core" containerID="cri-o://c9cb354e91aa7005db421f2582735683c64a50e2227e2ef383f40d218a041cfc" gracePeriod=30 Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.175442 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="ceilometer-notification-agent" containerID="cri-o://893c341f28a27b00f8cc7d20795aa37f2e98ddba787b335a0965cbc6e649705a" gracePeriod=30 Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.198687 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.208168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.208235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-config-data\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.208292 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwmjd\" (UniqueName: \"kubernetes.io/projected/01351004-ea7d-4973-9dd2-859022a35edb-kube-api-access-mwmjd\") pod \"cinder-scheduler-0\" 
(UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.208318 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.208366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-scripts\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.208436 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/01351004-ea7d-4973-9dd2-859022a35edb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.208516 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/01351004-ea7d-4973-9dd2-859022a35edb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.214168 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.216166 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.216501 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-config-data\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.225308 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-scripts\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.240099 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwmjd\" (UniqueName: \"kubernetes.io/projected/01351004-ea7d-4973-9dd2-859022a35edb-kube-api-access-mwmjd\") pod \"cinder-scheduler-0\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.337769 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.611641 4749 generic.go:334] "Generic (PLEG): container finished" podID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerID="9aeff0174de3e22e208a8c6418d5fbd0bb88c6f7a4980e4e32345580b8482cd6" exitCode=0 Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.612127 4749 generic.go:334] "Generic (PLEG): container finished" podID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerID="c9cb354e91aa7005db421f2582735683c64a50e2227e2ef383f40d218a041cfc" exitCode=2 Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.612213 4749 generic.go:334] "Generic (PLEG): container finished" podID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerID="f843da5b161c55109383cd92b6b7fbfcfaa272d107fef445d38434f8e575b4b2" exitCode=0 Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.611855 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e7b33d-a14d-42aa-838d-158a2a4229a1","Type":"ContainerDied","Data":"9aeff0174de3e22e208a8c6418d5fbd0bb88c6f7a4980e4e32345580b8482cd6"} Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.612345 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e7b33d-a14d-42aa-838d-158a2a4229a1","Type":"ContainerDied","Data":"c9cb354e91aa7005db421f2582735683c64a50e2227e2ef383f40d218a041cfc"} Mar 10 16:09:52 crc kubenswrapper[4749]: I0310 16:09:52.612443 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e7b33d-a14d-42aa-838d-158a2a4229a1","Type":"ContainerDied","Data":"f843da5b161c55109383cd92b6b7fbfcfaa272d107fef445d38434f8e575b4b2"} Mar 10 16:09:53 crc kubenswrapper[4749]: I0310 16:09:53.624064 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c12cece-3521-4500-83eb-451ca55c6443" path="/var/lib/kubelet/pods/2c12cece-3521-4500-83eb-451ca55c6443/volumes" Mar 10 16:09:53 crc 
kubenswrapper[4749]: I0310 16:09:53.630229 4749 generic.go:334] "Generic (PLEG): container finished" podID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerID="893c341f28a27b00f8cc7d20795aa37f2e98ddba787b335a0965cbc6e649705a" exitCode=0 Mar 10 16:09:53 crc kubenswrapper[4749]: I0310 16:09:53.630277 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e7b33d-a14d-42aa-838d-158a2a4229a1","Type":"ContainerDied","Data":"893c341f28a27b00f8cc7d20795aa37f2e98ddba787b335a0965cbc6e649705a"} Mar 10 16:09:53 crc kubenswrapper[4749]: I0310 16:09:53.802712 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 10 16:09:57 crc kubenswrapper[4749]: W0310 16:09:57.280705 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod852c97ea_349d_4262_b36c_2ef7aa81ae75.slice/crio-7e49d1de40d2ec08bea683ec20f01f5bf6b672b490dd965c02c35e2420ea56a1 WatchSource:0}: Error finding container 7e49d1de40d2ec08bea683ec20f01f5bf6b672b490dd965c02c35e2420ea56a1: Status 404 returned error can't find the container with id 7e49d1de40d2ec08bea683ec20f01f5bf6b672b490dd965c02c35e2420ea56a1 Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.395744 4749 scope.go:117] "RemoveContainer" containerID="3be29bc06bd62d1df71e7532448edba35ae1e450f2c14289d1227be502bf255f" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.551628 4749 scope.go:117] "RemoveContainer" containerID="50134020b3b66e6f9766a1f6645379f6c4e994e3f3b677bafe82fac9cb3ec48a" Mar 10 16:09:57 crc kubenswrapper[4749]: E0310 16:09:57.552449 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50134020b3b66e6f9766a1f6645379f6c4e994e3f3b677bafe82fac9cb3ec48a\": container with ID starting with 50134020b3b66e6f9766a1f6645379f6c4e994e3f3b677bafe82fac9cb3ec48a not found: ID does not exist" 
containerID="50134020b3b66e6f9766a1f6645379f6c4e994e3f3b677bafe82fac9cb3ec48a" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.552495 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50134020b3b66e6f9766a1f6645379f6c4e994e3f3b677bafe82fac9cb3ec48a"} err="failed to get container status \"50134020b3b66e6f9766a1f6645379f6c4e994e3f3b677bafe82fac9cb3ec48a\": rpc error: code = NotFound desc = could not find container \"50134020b3b66e6f9766a1f6645379f6c4e994e3f3b677bafe82fac9cb3ec48a\": container with ID starting with 50134020b3b66e6f9766a1f6645379f6c4e994e3f3b677bafe82fac9cb3ec48a not found: ID does not exist" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.552521 4749 scope.go:117] "RemoveContainer" containerID="3be29bc06bd62d1df71e7532448edba35ae1e450f2c14289d1227be502bf255f" Mar 10 16:09:57 crc kubenswrapper[4749]: E0310 16:09:57.552730 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be29bc06bd62d1df71e7532448edba35ae1e450f2c14289d1227be502bf255f\": container with ID starting with 3be29bc06bd62d1df71e7532448edba35ae1e450f2c14289d1227be502bf255f not found: ID does not exist" containerID="3be29bc06bd62d1df71e7532448edba35ae1e450f2c14289d1227be502bf255f" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.552745 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be29bc06bd62d1df71e7532448edba35ae1e450f2c14289d1227be502bf255f"} err="failed to get container status \"3be29bc06bd62d1df71e7532448edba35ae1e450f2c14289d1227be502bf255f\": rpc error: code = NotFound desc = could not find container \"3be29bc06bd62d1df71e7532448edba35ae1e450f2c14289d1227be502bf255f\": container with ID starting with 3be29bc06bd62d1df71e7532448edba35ae1e450f2c14289d1227be502bf255f not found: ID does not exist" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.691632 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67dd78ff7-qfbxb" event={"ID":"852c97ea-349d-4262-b36c-2ef7aa81ae75","Type":"ContainerStarted","Data":"aaeec1a32c2eac40dd562bef0ec6d26e9e7922c02915dd496d085b035d04bd0f"} Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.692254 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67dd78ff7-qfbxb" event={"ID":"852c97ea-349d-4262-b36c-2ef7aa81ae75","Type":"ContainerStarted","Data":"7e49d1de40d2ec08bea683ec20f01f5bf6b672b490dd965c02c35e2420ea56a1"} Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.695857 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"bdf02d6a-5794-4b1d-b155-f683bdb8680d","Type":"ContainerStarted","Data":"217e914770081e2bccbbae1cf847983e071573c618440c545a24e7fc9b20a92f"} Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.729917 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.209742303 podStartE2EDuration="11.729900334s" podCreationTimestamp="2026-03-10 16:09:46 +0000 UTC" firstStartedPulling="2026-03-10 16:09:47.896794017 +0000 UTC m=+1285.018659714" lastFinishedPulling="2026-03-10 16:09:57.416952058 +0000 UTC m=+1294.538817745" observedRunningTime="2026-03-10 16:09:57.720863785 +0000 UTC m=+1294.842729472" watchObservedRunningTime="2026-03-10 16:09:57.729900334 +0000 UTC m=+1294.851766021" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.736502 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.834020 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e7b33d-a14d-42aa-838d-158a2a4229a1-log-httpd\") pod \"57e7b33d-a14d-42aa-838d-158a2a4229a1\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.834140 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-sg-core-conf-yaml\") pod \"57e7b33d-a14d-42aa-838d-158a2a4229a1\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.834207 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9wzs\" (UniqueName: \"kubernetes.io/projected/57e7b33d-a14d-42aa-838d-158a2a4229a1-kube-api-access-k9wzs\") pod \"57e7b33d-a14d-42aa-838d-158a2a4229a1\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.834314 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e7b33d-a14d-42aa-838d-158a2a4229a1-run-httpd\") pod \"57e7b33d-a14d-42aa-838d-158a2a4229a1\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.834409 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-config-data\") pod \"57e7b33d-a14d-42aa-838d-158a2a4229a1\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.834442 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-combined-ca-bundle\") pod \"57e7b33d-a14d-42aa-838d-158a2a4229a1\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.834494 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-scripts\") pod \"57e7b33d-a14d-42aa-838d-158a2a4229a1\" (UID: \"57e7b33d-a14d-42aa-838d-158a2a4229a1\") " Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.834940 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57e7b33d-a14d-42aa-838d-158a2a4229a1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "57e7b33d-a14d-42aa-838d-158a2a4229a1" (UID: "57e7b33d-a14d-42aa-838d-158a2a4229a1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.835013 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57e7b33d-a14d-42aa-838d-158a2a4229a1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "57e7b33d-a14d-42aa-838d-158a2a4229a1" (UID: "57e7b33d-a14d-42aa-838d-158a2a4229a1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.841553 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-scripts" (OuterVolumeSpecName: "scripts") pod "57e7b33d-a14d-42aa-838d-158a2a4229a1" (UID: "57e7b33d-a14d-42aa-838d-158a2a4229a1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.841582 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e7b33d-a14d-42aa-838d-158a2a4229a1-kube-api-access-k9wzs" (OuterVolumeSpecName: "kube-api-access-k9wzs") pod "57e7b33d-a14d-42aa-838d-158a2a4229a1" (UID: "57e7b33d-a14d-42aa-838d-158a2a4229a1"). InnerVolumeSpecName "kube-api-access-k9wzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.877771 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "57e7b33d-a14d-42aa-838d-158a2a4229a1" (UID: "57e7b33d-a14d-42aa-838d-158a2a4229a1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.914395 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.930424 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57e7b33d-a14d-42aa-838d-158a2a4229a1" (UID: "57e7b33d-a14d-42aa-838d-158a2a4229a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.936453 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.936480 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9wzs\" (UniqueName: \"kubernetes.io/projected/57e7b33d-a14d-42aa-838d-158a2a4229a1-kube-api-access-k9wzs\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.936490 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e7b33d-a14d-42aa-838d-158a2a4229a1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.936500 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.936511 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.936518 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e7b33d-a14d-42aa-838d-158a2a4229a1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:57 crc kubenswrapper[4749]: I0310 16:09:57.950589 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-config-data" (OuterVolumeSpecName: "config-data") pod "57e7b33d-a14d-42aa-838d-158a2a4229a1" (UID: "57e7b33d-a14d-42aa-838d-158a2a4229a1"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.038640 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e7b33d-a14d-42aa-838d-158a2a4229a1-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.679253 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hqhhg"] Mar 10 16:09:58 crc kubenswrapper[4749]: E0310 16:09:58.680329 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="proxy-httpd" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.680344 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="proxy-httpd" Mar 10 16:09:58 crc kubenswrapper[4749]: E0310 16:09:58.680364 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="sg-core" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.680370 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="sg-core" Mar 10 16:09:58 crc kubenswrapper[4749]: E0310 16:09:58.680499 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="ceilometer-notification-agent" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.680507 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="ceilometer-notification-agent" Mar 10 16:09:58 crc kubenswrapper[4749]: E0310 16:09:58.680522 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="ceilometer-central-agent" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.680528 4749 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="ceilometer-central-agent" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.680692 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="ceilometer-central-agent" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.680707 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="sg-core" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.680719 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="ceilometer-notification-agent" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.680730 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" containerName="proxy-httpd" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.681354 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hqhhg" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.693602 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hqhhg"] Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.717164 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"01351004-ea7d-4973-9dd2-859022a35edb","Type":"ContainerStarted","Data":"979fafe5fb14a8e96ba3c95974f251f5e9ed6197a8ab83b091aa994aadb744a2"} Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.717233 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"01351004-ea7d-4973-9dd2-859022a35edb","Type":"ContainerStarted","Data":"b33eca7bbd7ef710fb918f73760b72855c719ff855e7100356c5ad12cefc8dc0"} Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.727064 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e7b33d-a14d-42aa-838d-158a2a4229a1","Type":"ContainerDied","Data":"ca4fb8a70014bb4a1a1fa25709752200781a17a73e76b8bcec7ff643540cbdbd"} Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.727276 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.727294 4749 scope.go:117] "RemoveContainer" containerID="9aeff0174de3e22e208a8c6418d5fbd0bb88c6f7a4980e4e32345580b8482cd6" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.735450 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.735496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67dd78ff7-qfbxb" event={"ID":"852c97ea-349d-4262-b36c-2ef7aa81ae75","Type":"ContainerStarted","Data":"fb1538a41411d8393adf1f0c90bcd1848a6d9fc8136457b097afbfd0c4663176"} Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.735520 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.750964 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16f63562-8b95-41ed-92c8-f7a215854065-operator-scripts\") pod \"nova-api-db-create-hqhhg\" (UID: \"16f63562-8b95-41ed-92c8-f7a215854065\") " pod="openstack/nova-api-db-create-hqhhg" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.751134 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8clf5\" (UniqueName: \"kubernetes.io/projected/16f63562-8b95-41ed-92c8-f7a215854065-kube-api-access-8clf5\") pod \"nova-api-db-create-hqhhg\" (UID: \"16f63562-8b95-41ed-92c8-f7a215854065\") " pod="openstack/nova-api-db-create-hqhhg" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.782729 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-67dd78ff7-qfbxb" podStartSLOduration=8.782709411999999 podStartE2EDuration="8.782709412s" 
podCreationTimestamp="2026-03-10 16:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:09:58.757720853 +0000 UTC m=+1295.879586540" watchObservedRunningTime="2026-03-10 16:09:58.782709412 +0000 UTC m=+1295.904575099" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.811447 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-116d-account-create-update-66pnv"] Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.812649 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-116d-account-create-update-66pnv" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.821178 4749 scope.go:117] "RemoveContainer" containerID="c9cb354e91aa7005db421f2582735683c64a50e2227e2ef383f40d218a041cfc" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.822714 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.854468 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.858234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8clf5\" (UniqueName: \"kubernetes.io/projected/16f63562-8b95-41ed-92c8-f7a215854065-kube-api-access-8clf5\") pod \"nova-api-db-create-hqhhg\" (UID: \"16f63562-8b95-41ed-92c8-f7a215854065\") " pod="openstack/nova-api-db-create-hqhhg" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.860680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16f63562-8b95-41ed-92c8-f7a215854065-operator-scripts\") pod \"nova-api-db-create-hqhhg\" (UID: \"16f63562-8b95-41ed-92c8-f7a215854065\") " pod="openstack/nova-api-db-create-hqhhg" Mar 10 16:09:58 crc 
kubenswrapper[4749]: I0310 16:09:58.864284 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16f63562-8b95-41ed-92c8-f7a215854065-operator-scripts\") pod \"nova-api-db-create-hqhhg\" (UID: \"16f63562-8b95-41ed-92c8-f7a215854065\") " pod="openstack/nova-api-db-create-hqhhg" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.889763 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8clf5\" (UniqueName: \"kubernetes.io/projected/16f63562-8b95-41ed-92c8-f7a215854065-kube-api-access-8clf5\") pod \"nova-api-db-create-hqhhg\" (UID: \"16f63562-8b95-41ed-92c8-f7a215854065\") " pod="openstack/nova-api-db-create-hqhhg" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.898175 4749 scope.go:117] "RemoveContainer" containerID="893c341f28a27b00f8cc7d20795aa37f2e98ddba787b335a0965cbc6e649705a" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.913239 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.923335 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-ct548"] Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.925066 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-ct548" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.931150 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-116d-account-create-update-66pnv"] Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.952319 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.956798 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.962483 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.962942 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.965931 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs482\" (UniqueName: \"kubernetes.io/projected/1128619c-2c1a-4589-a013-34444a447036-kube-api-access-xs482\") pod \"nova-api-116d-account-create-update-66pnv\" (UID: \"1128619c-2c1a-4589-a013-34444a447036\") " pod="openstack/nova-api-116d-account-create-update-66pnv" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.966119 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1128619c-2c1a-4589-a013-34444a447036-operator-scripts\") pod \"nova-api-116d-account-create-update-66pnv\" (UID: \"1128619c-2c1a-4589-a013-34444a447036\") " pod="openstack/nova-api-116d-account-create-update-66pnv" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.973016 4749 scope.go:117] "RemoveContainer" containerID="f843da5b161c55109383cd92b6b7fbfcfaa272d107fef445d38434f8e575b4b2" Mar 10 16:09:58 crc kubenswrapper[4749]: I0310 16:09:58.986917 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-ct548"] Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:58.999845 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.006662 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hqhhg" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.030144 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-d8xlj"] Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.031337 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d8xlj" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.035658 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-d8xlj"] Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.050747 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-939f-account-create-update-9kdfk"] Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.057607 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-939f-account-create-update-9kdfk" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.060361 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.067540 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.067594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.067688 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b61b53b7-481c-4db6-9cf3-fd824848684c-log-httpd\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.067904 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f0ac736-fc91-42f3-a2af-564141a88227-operator-scripts\") pod \"nova-cell0-db-create-ct548\" (UID: \"1f0ac736-fc91-42f3-a2af-564141a88227\") " pod="openstack/nova-cell0-db-create-ct548" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.067933 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvhgt\" (UniqueName: \"kubernetes.io/projected/b61b53b7-481c-4db6-9cf3-fd824848684c-kube-api-access-bvhgt\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.068275 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs482\" (UniqueName: \"kubernetes.io/projected/1128619c-2c1a-4589-a013-34444a447036-kube-api-access-xs482\") pod \"nova-api-116d-account-create-update-66pnv\" (UID: \"1128619c-2c1a-4589-a013-34444a447036\") " pod="openstack/nova-api-116d-account-create-update-66pnv" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.068302 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-scripts\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.068336 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b61b53b7-481c-4db6-9cf3-fd824848684c-run-httpd\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.068949 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gcw8\" (UniqueName: \"kubernetes.io/projected/1f0ac736-fc91-42f3-a2af-564141a88227-kube-api-access-7gcw8\") pod \"nova-cell0-db-create-ct548\" (UID: \"1f0ac736-fc91-42f3-a2af-564141a88227\") " pod="openstack/nova-cell0-db-create-ct548" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.069043 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1128619c-2c1a-4589-a013-34444a447036-operator-scripts\") pod \"nova-api-116d-account-create-update-66pnv\" (UID: \"1128619c-2c1a-4589-a013-34444a447036\") " pod="openstack/nova-api-116d-account-create-update-66pnv" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.069149 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-config-data\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.073889 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1128619c-2c1a-4589-a013-34444a447036-operator-scripts\") pod \"nova-api-116d-account-create-update-66pnv\" (UID: \"1128619c-2c1a-4589-a013-34444a447036\") " pod="openstack/nova-api-116d-account-create-update-66pnv" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.075123 4749 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-cell0-939f-account-create-update-9kdfk"] Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.088774 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs482\" (UniqueName: \"kubernetes.io/projected/1128619c-2c1a-4589-a013-34444a447036-kube-api-access-xs482\") pod \"nova-api-116d-account-create-update-66pnv\" (UID: \"1128619c-2c1a-4589-a013-34444a447036\") " pod="openstack/nova-api-116d-account-create-update-66pnv" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.145923 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-116d-account-create-update-66pnv" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.172643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.172692 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e84f538a-6b0b-44e6-863b-bd06abf880d7-operator-scripts\") pod \"nova-cell0-939f-account-create-update-9kdfk\" (UID: \"e84f538a-6b0b-44e6-863b-bd06abf880d7\") " pod="openstack/nova-cell0-939f-account-create-update-9kdfk" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.172727 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.172746 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b61b53b7-481c-4db6-9cf3-fd824848684c-log-httpd\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.172780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f0ac736-fc91-42f3-a2af-564141a88227-operator-scripts\") pod \"nova-cell0-db-create-ct548\" (UID: \"1f0ac736-fc91-42f3-a2af-564141a88227\") " pod="openstack/nova-cell0-db-create-ct548" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.172803 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvhgt\" (UniqueName: \"kubernetes.io/projected/b61b53b7-481c-4db6-9cf3-fd824848684c-kube-api-access-bvhgt\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.172836 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-scripts\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.172863 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b61b53b7-481c-4db6-9cf3-fd824848684c-run-httpd\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.172889 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gcw8\" (UniqueName: \"kubernetes.io/projected/1f0ac736-fc91-42f3-a2af-564141a88227-kube-api-access-7gcw8\") pod \"nova-cell0-db-create-ct548\" (UID: 
\"1f0ac736-fc91-42f3-a2af-564141a88227\") " pod="openstack/nova-cell0-db-create-ct548" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.172910 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqhnz\" (UniqueName: \"kubernetes.io/projected/e84f538a-6b0b-44e6-863b-bd06abf880d7-kube-api-access-wqhnz\") pod \"nova-cell0-939f-account-create-update-9kdfk\" (UID: \"e84f538a-6b0b-44e6-863b-bd06abf880d7\") " pod="openstack/nova-cell0-939f-account-create-update-9kdfk" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.174142 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-569gb\" (UniqueName: \"kubernetes.io/projected/e3f2f368-abe2-4fe8-835e-8c60a954ab97-kube-api-access-569gb\") pod \"nova-cell1-db-create-d8xlj\" (UID: \"e3f2f368-abe2-4fe8-835e-8c60a954ab97\") " pod="openstack/nova-cell1-db-create-d8xlj" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.174191 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3f2f368-abe2-4fe8-835e-8c60a954ab97-operator-scripts\") pod \"nova-cell1-db-create-d8xlj\" (UID: \"e3f2f368-abe2-4fe8-835e-8c60a954ab97\") " pod="openstack/nova-cell1-db-create-d8xlj" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.174211 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-config-data\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.177559 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b61b53b7-481c-4db6-9cf3-fd824848684c-log-httpd\") pod \"ceilometer-0\" (UID: 
\"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.178650 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f0ac736-fc91-42f3-a2af-564141a88227-operator-scripts\") pod \"nova-cell0-db-create-ct548\" (UID: \"1f0ac736-fc91-42f3-a2af-564141a88227\") " pod="openstack/nova-cell0-db-create-ct548" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.178963 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b61b53b7-481c-4db6-9cf3-fd824848684c-run-httpd\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.179342 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.186994 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.187950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-config-data\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.198004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-scripts\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.198840 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvhgt\" (UniqueName: \"kubernetes.io/projected/b61b53b7-481c-4db6-9cf3-fd824848684c-kube-api-access-bvhgt\") pod \"ceilometer-0\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.204144 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-88fa-account-create-update-sh94m"] Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.205236 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88fa-account-create-update-sh94m" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.206316 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gcw8\" (UniqueName: \"kubernetes.io/projected/1f0ac736-fc91-42f3-a2af-564141a88227-kube-api-access-7gcw8\") pod \"nova-cell0-db-create-ct548\" (UID: \"1f0ac736-fc91-42f3-a2af-564141a88227\") " pod="openstack/nova-cell0-db-create-ct548" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.210227 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.255899 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-88fa-account-create-update-sh94m"] Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.284447 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ct548" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.290251 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqhnz\" (UniqueName: \"kubernetes.io/projected/e84f538a-6b0b-44e6-863b-bd06abf880d7-kube-api-access-wqhnz\") pod \"nova-cell0-939f-account-create-update-9kdfk\" (UID: \"e84f538a-6b0b-44e6-863b-bd06abf880d7\") " pod="openstack/nova-cell0-939f-account-create-update-9kdfk" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.290411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6xt8\" (UniqueName: \"kubernetes.io/projected/46585eed-121a-4500-a26b-70bddeeeb075-kube-api-access-d6xt8\") pod \"nova-cell1-88fa-account-create-update-sh94m\" (UID: \"46585eed-121a-4500-a26b-70bddeeeb075\") " pod="openstack/nova-cell1-88fa-account-create-update-sh94m" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.290464 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-569gb\" (UniqueName: \"kubernetes.io/projected/e3f2f368-abe2-4fe8-835e-8c60a954ab97-kube-api-access-569gb\") pod \"nova-cell1-db-create-d8xlj\" (UID: \"e3f2f368-abe2-4fe8-835e-8c60a954ab97\") " pod="openstack/nova-cell1-db-create-d8xlj" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.290505 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46585eed-121a-4500-a26b-70bddeeeb075-operator-scripts\") pod \"nova-cell1-88fa-account-create-update-sh94m\" (UID: \"46585eed-121a-4500-a26b-70bddeeeb075\") " pod="openstack/nova-cell1-88fa-account-create-update-sh94m" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.290614 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e3f2f368-abe2-4fe8-835e-8c60a954ab97-operator-scripts\") pod \"nova-cell1-db-create-d8xlj\" (UID: \"e3f2f368-abe2-4fe8-835e-8c60a954ab97\") " pod="openstack/nova-cell1-db-create-d8xlj" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.290954 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e84f538a-6b0b-44e6-863b-bd06abf880d7-operator-scripts\") pod \"nova-cell0-939f-account-create-update-9kdfk\" (UID: \"e84f538a-6b0b-44e6-863b-bd06abf880d7\") " pod="openstack/nova-cell0-939f-account-create-update-9kdfk" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.292289 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e84f538a-6b0b-44e6-863b-bd06abf880d7-operator-scripts\") pod \"nova-cell0-939f-account-create-update-9kdfk\" (UID: \"e84f538a-6b0b-44e6-863b-bd06abf880d7\") " pod="openstack/nova-cell0-939f-account-create-update-9kdfk" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.293522 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3f2f368-abe2-4fe8-835e-8c60a954ab97-operator-scripts\") pod \"nova-cell1-db-create-d8xlj\" (UID: \"e3f2f368-abe2-4fe8-835e-8c60a954ab97\") " pod="openstack/nova-cell1-db-create-d8xlj" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.309098 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.316475 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-569gb\" (UniqueName: \"kubernetes.io/projected/e3f2f368-abe2-4fe8-835e-8c60a954ab97-kube-api-access-569gb\") pod \"nova-cell1-db-create-d8xlj\" (UID: \"e3f2f368-abe2-4fe8-835e-8c60a954ab97\") " pod="openstack/nova-cell1-db-create-d8xlj" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.340312 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqhnz\" (UniqueName: \"kubernetes.io/projected/e84f538a-6b0b-44e6-863b-bd06abf880d7-kube-api-access-wqhnz\") pod \"nova-cell0-939f-account-create-update-9kdfk\" (UID: \"e84f538a-6b0b-44e6-863b-bd06abf880d7\") " pod="openstack/nova-cell0-939f-account-create-update-9kdfk" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.357147 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d8xlj" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.381870 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-939f-account-create-update-9kdfk" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.389167 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.389400 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f83d36e6-e860-47a3-8590-d0a468a8819a" containerName="glance-log" containerID="cri-o://8fbe1339ff08841cc1604985c8269450f32793172da5b912329dedd533f298cd" gracePeriod=30 Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.389815 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f83d36e6-e860-47a3-8590-d0a468a8819a" containerName="glance-httpd" containerID="cri-o://c556b2bfacae6027405b4d2e2aa2707c6cb88193c5f1a7d8fa814be6d3f556d5" gracePeriod=30 Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.402363 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6xt8\" (UniqueName: \"kubernetes.io/projected/46585eed-121a-4500-a26b-70bddeeeb075-kube-api-access-d6xt8\") pod \"nova-cell1-88fa-account-create-update-sh94m\" (UID: \"46585eed-121a-4500-a26b-70bddeeeb075\") " pod="openstack/nova-cell1-88fa-account-create-update-sh94m" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.402451 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46585eed-121a-4500-a26b-70bddeeeb075-operator-scripts\") pod \"nova-cell1-88fa-account-create-update-sh94m\" (UID: \"46585eed-121a-4500-a26b-70bddeeeb075\") " pod="openstack/nova-cell1-88fa-account-create-update-sh94m" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.403213 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/46585eed-121a-4500-a26b-70bddeeeb075-operator-scripts\") pod \"nova-cell1-88fa-account-create-update-sh94m\" (UID: \"46585eed-121a-4500-a26b-70bddeeeb075\") " pod="openstack/nova-cell1-88fa-account-create-update-sh94m" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.451966 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6xt8\" (UniqueName: \"kubernetes.io/projected/46585eed-121a-4500-a26b-70bddeeeb075-kube-api-access-d6xt8\") pod \"nova-cell1-88fa-account-create-update-sh94m\" (UID: \"46585eed-121a-4500-a26b-70bddeeeb075\") " pod="openstack/nova-cell1-88fa-account-create-update-sh94m" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.583934 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88fa-account-create-update-sh94m" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.637335 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57e7b33d-a14d-42aa-838d-158a2a4229a1" path="/var/lib/kubelet/pods/57e7b33d-a14d-42aa-838d-158a2a4229a1/volumes" Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.818835 4749 generic.go:334] "Generic (PLEG): container finished" podID="f83d36e6-e860-47a3-8590-d0a468a8819a" containerID="8fbe1339ff08841cc1604985c8269450f32793172da5b912329dedd533f298cd" exitCode=143 Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.818895 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f83d36e6-e860-47a3-8590-d0a468a8819a","Type":"ContainerDied","Data":"8fbe1339ff08841cc1604985c8269450f32793172da5b912329dedd533f298cd"} Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.830653 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"01351004-ea7d-4973-9dd2-859022a35edb","Type":"ContainerStarted","Data":"bb17514493f3006a9700ec5156a08c7d51cbec38b230b0424cddada1c317646a"} Mar 
10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.900500 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-116d-account-create-update-66pnv"] Mar 10 16:09:59 crc kubenswrapper[4749]: I0310 16:09:59.913414 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hqhhg"] Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.158293 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552650-qd6cs"] Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.160160 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552650-qd6cs" Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.164028 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.165244 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.165522 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.186495 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552650-qd6cs"] Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.203150 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:00 crc kubenswrapper[4749]: W0310 16:10:00.209563 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb61b53b7_481c_4db6_9cf3_fd824848684c.slice/crio-056c5cd620b27a77df9e29f90879410427b49a093834440fd3f481dc266f1560 WatchSource:0}: Error finding container 056c5cd620b27a77df9e29f90879410427b49a093834440fd3f481dc266f1560: Status 404 returned error can't 
find the container with id 056c5cd620b27a77df9e29f90879410427b49a093834440fd3f481dc266f1560 Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.225155 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-d8xlj"] Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.231815 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krdn5\" (UniqueName: \"kubernetes.io/projected/d9b7cbb5-41b0-4439-a0b9-ce126583684c-kube-api-access-krdn5\") pod \"auto-csr-approver-29552650-qd6cs\" (UID: \"d9b7cbb5-41b0-4439-a0b9-ce126583684c\") " pod="openshift-infra/auto-csr-approver-29552650-qd6cs" Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.333528 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krdn5\" (UniqueName: \"kubernetes.io/projected/d9b7cbb5-41b0-4439-a0b9-ce126583684c-kube-api-access-krdn5\") pod \"auto-csr-approver-29552650-qd6cs\" (UID: \"d9b7cbb5-41b0-4439-a0b9-ce126583684c\") " pod="openshift-infra/auto-csr-approver-29552650-qd6cs" Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.356374 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krdn5\" (UniqueName: \"kubernetes.io/projected/d9b7cbb5-41b0-4439-a0b9-ce126583684c-kube-api-access-krdn5\") pod \"auto-csr-approver-29552650-qd6cs\" (UID: \"d9b7cbb5-41b0-4439-a0b9-ce126583684c\") " pod="openshift-infra/auto-csr-approver-29552650-qd6cs" Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.390701 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-ct548"] Mar 10 16:10:00 crc kubenswrapper[4749]: W0310 16:10:00.395652 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f0ac736_fc91_42f3_a2af_564141a88227.slice/crio-43a3520bb6cdeb249bdd84c3fcae1c86c8dce76b40370828bb7d097eef70e3f4 
WatchSource:0}: Error finding container 43a3520bb6cdeb249bdd84c3fcae1c86c8dce76b40370828bb7d097eef70e3f4: Status 404 returned error can't find the container with id 43a3520bb6cdeb249bdd84c3fcae1c86c8dce76b40370828bb7d097eef70e3f4 Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.434001 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-939f-account-create-update-9kdfk"] Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.441779 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-88fa-account-create-update-sh94m"] Mar 10 16:10:00 crc kubenswrapper[4749]: W0310 16:10:00.452240 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46585eed_121a_4500_a26b_70bddeeeb075.slice/crio-72c235027498d35291c75c90c319957ba757cf9aab89f738ebf80197679d25c7 WatchSource:0}: Error finding container 72c235027498d35291c75c90c319957ba757cf9aab89f738ebf80197679d25c7: Status 404 returned error can't find the container with id 72c235027498d35291c75c90c319957ba757cf9aab89f738ebf80197679d25c7 Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.491841 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552650-qd6cs" Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.583139 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.620642 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.620878 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="0a876bab-aa64-429f-bcb8-7e644cc4f547" containerName="kube-state-metrics" containerID="cri-o://179d78e0fc74dfc60f01357573da8d062315f456cade3c85a12fccf2e1aae2e5" gracePeriod=30 Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.841781 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-939f-account-create-update-9kdfk" event={"ID":"e84f538a-6b0b-44e6-863b-bd06abf880d7","Type":"ContainerStarted","Data":"08cd37b035ff9819f53bf0d5b89a7ddb709923ba738e95c0245cc2df49b9dc79"} Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.844019 4749 generic.go:334] "Generic (PLEG): container finished" podID="0a876bab-aa64-429f-bcb8-7e644cc4f547" containerID="179d78e0fc74dfc60f01357573da8d062315f456cade3c85a12fccf2e1aae2e5" exitCode=2 Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.844082 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0a876bab-aa64-429f-bcb8-7e644cc4f547","Type":"ContainerDied","Data":"179d78e0fc74dfc60f01357573da8d062315f456cade3c85a12fccf2e1aae2e5"} Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.845235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hqhhg" event={"ID":"16f63562-8b95-41ed-92c8-f7a215854065","Type":"ContainerStarted","Data":"42bf975dd6dd1e70cf8f35b3e289f5e0be89fe07587e2e533e76c74511cca2f6"} Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 
16:10:00.845257 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hqhhg" event={"ID":"16f63562-8b95-41ed-92c8-f7a215854065","Type":"ContainerStarted","Data":"a58de2a1030e1413e091229fad9f8dff8d47b5d5701eec2e9336f02ca3fbcb47"} Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.846930 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-88fa-account-create-update-sh94m" event={"ID":"46585eed-121a-4500-a26b-70bddeeeb075","Type":"ContainerStarted","Data":"72c235027498d35291c75c90c319957ba757cf9aab89f738ebf80197679d25c7"} Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.848751 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b61b53b7-481c-4db6-9cf3-fd824848684c","Type":"ContainerStarted","Data":"056c5cd620b27a77df9e29f90879410427b49a093834440fd3f481dc266f1560"} Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.852065 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ct548" event={"ID":"1f0ac736-fc91-42f3-a2af-564141a88227","Type":"ContainerStarted","Data":"43a3520bb6cdeb249bdd84c3fcae1c86c8dce76b40370828bb7d097eef70e3f4"} Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.853475 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-116d-account-create-update-66pnv" event={"ID":"1128619c-2c1a-4589-a013-34444a447036","Type":"ContainerStarted","Data":"458fa37559c158012335c6ee0cb81f0c9c93e1d51747e38d0abb4f5edd8c6cba"} Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.854890 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d8xlj" event={"ID":"e3f2f368-abe2-4fe8-835e-8c60a954ab97","Type":"ContainerStarted","Data":"957850ff5b5ff0091c4bfa9c0bf18bb19fea1538ec601d4c3b619a25cb550a88"} Mar 10 16:10:00 crc kubenswrapper[4749]: I0310 16:10:00.876760 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-scheduler-0" podStartSLOduration=9.876742919 podStartE2EDuration="9.876742919s" podCreationTimestamp="2026-03-10 16:09:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:10:00.87460971 +0000 UTC m=+1297.996475397" watchObservedRunningTime="2026-03-10 16:10:00.876742919 +0000 UTC m=+1297.998608606" Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.075559 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552650-qd6cs"] Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.866306 4749 generic.go:334] "Generic (PLEG): container finished" podID="46585eed-121a-4500-a26b-70bddeeeb075" containerID="3aacb276a70bd2dd9c3f2e14a1f0784b86c6019fd8b98d5042d0b2baa9fde022" exitCode=0 Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.866368 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-88fa-account-create-update-sh94m" event={"ID":"46585eed-121a-4500-a26b-70bddeeeb075","Type":"ContainerDied","Data":"3aacb276a70bd2dd9c3f2e14a1f0784b86c6019fd8b98d5042d0b2baa9fde022"} Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.869119 4749 generic.go:334] "Generic (PLEG): container finished" podID="1f0ac736-fc91-42f3-a2af-564141a88227" containerID="2f422d3b9d73bc88ff42a20aa5d8b2baa02aba46433f78fa49db5d0c304bdbd6" exitCode=0 Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.869245 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ct548" event={"ID":"1f0ac736-fc91-42f3-a2af-564141a88227","Type":"ContainerDied","Data":"2f422d3b9d73bc88ff42a20aa5d8b2baa02aba46433f78fa49db5d0c304bdbd6"} Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.872593 4749 generic.go:334] "Generic (PLEG): container finished" podID="1128619c-2c1a-4589-a013-34444a447036" containerID="12b17b01d50b727c55fa469a834ed3400e907fc685dc1d7f44466afffff37d2b" 
exitCode=0 Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.872642 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-116d-account-create-update-66pnv" event={"ID":"1128619c-2c1a-4589-a013-34444a447036","Type":"ContainerDied","Data":"12b17b01d50b727c55fa469a834ed3400e907fc685dc1d7f44466afffff37d2b"} Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.874733 4749 generic.go:334] "Generic (PLEG): container finished" podID="e3f2f368-abe2-4fe8-835e-8c60a954ab97" containerID="d67ad0da493eea572574b3e20465891a70fb4ef286a417816a82688aa213456a" exitCode=0 Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.874826 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d8xlj" event={"ID":"e3f2f368-abe2-4fe8-835e-8c60a954ab97","Type":"ContainerDied","Data":"d67ad0da493eea572574b3e20465891a70fb4ef286a417816a82688aa213456a"} Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.875636 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.881848 4749 generic.go:334] "Generic (PLEG): container finished" podID="e84f538a-6b0b-44e6-863b-bd06abf880d7" containerID="ca2411d2a7587ab136bf4f00c059256ccb860c1948771afc38ee76e061d6b749" exitCode=0 Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.881965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-939f-account-create-update-9kdfk" event={"ID":"e84f538a-6b0b-44e6-863b-bd06abf880d7","Type":"ContainerDied","Data":"ca2411d2a7587ab136bf4f00c059256ccb860c1948771afc38ee76e061d6b749"} Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.884539 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0a876bab-aa64-429f-bcb8-7e644cc4f547","Type":"ContainerDied","Data":"7ff5961e1ee2648f24fae15a4bb86fec514a10c7f59404769b8fb0d09e8ee815"} Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.884612 4749 scope.go:117] "RemoveContainer" containerID="179d78e0fc74dfc60f01357573da8d062315f456cade3c85a12fccf2e1aae2e5" Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.884546 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.891016 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552650-qd6cs" event={"ID":"d9b7cbb5-41b0-4439-a0b9-ce126583684c","Type":"ContainerStarted","Data":"c48f1d38840320eccf734b1283e6895ed5f07622dbe8c59723df7ef00e483806"} Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.899250 4749 generic.go:334] "Generic (PLEG): container finished" podID="16f63562-8b95-41ed-92c8-f7a215854065" containerID="42bf975dd6dd1e70cf8f35b3e289f5e0be89fe07587e2e533e76c74511cca2f6" exitCode=0 Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.899401 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hqhhg" event={"ID":"16f63562-8b95-41ed-92c8-f7a215854065","Type":"ContainerDied","Data":"42bf975dd6dd1e70cf8f35b3e289f5e0be89fe07587e2e533e76c74511cca2f6"} Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.904993 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b61b53b7-481c-4db6-9cf3-fd824848684c","Type":"ContainerStarted","Data":"988657a47e0c4a66a76e1436fcd2954746b2fe0818b289c493f220bd639481c4"} Mar 10 16:10:01 crc kubenswrapper[4749]: I0310 16:10:01.991131 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lchbx\" (UniqueName: \"kubernetes.io/projected/0a876bab-aa64-429f-bcb8-7e644cc4f547-kube-api-access-lchbx\") pod \"0a876bab-aa64-429f-bcb8-7e644cc4f547\" (UID: \"0a876bab-aa64-429f-bcb8-7e644cc4f547\") " Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.003521 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a876bab-aa64-429f-bcb8-7e644cc4f547-kube-api-access-lchbx" (OuterVolumeSpecName: "kube-api-access-lchbx") pod "0a876bab-aa64-429f-bcb8-7e644cc4f547" (UID: "0a876bab-aa64-429f-bcb8-7e644cc4f547"). 
InnerVolumeSpecName "kube-api-access-lchbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.094197 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lchbx\" (UniqueName: \"kubernetes.io/projected/0a876bab-aa64-429f-bcb8-7e644cc4f547-kube-api-access-lchbx\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.226932 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.237630 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.248704 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 16:10:02 crc kubenswrapper[4749]: E0310 16:10:02.249145 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a876bab-aa64-429f-bcb8-7e644cc4f547" containerName="kube-state-metrics" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.249163 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a876bab-aa64-429f-bcb8-7e644cc4f547" containerName="kube-state-metrics" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.249341 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a876bab-aa64-429f-bcb8-7e644cc4f547" containerName="kube-state-metrics" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.250294 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.253881 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.254118 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.262051 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.302575 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " pod="openstack/kube-state-metrics-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.302640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " pod="openstack/kube-state-metrics-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.303063 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zmj4\" (UniqueName: \"kubernetes.io/projected/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-api-access-5zmj4\") pod \"kube-state-metrics-0\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " pod="openstack/kube-state-metrics-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.303096 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " pod="openstack/kube-state-metrics-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.337952 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.406303 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " pod="openstack/kube-state-metrics-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.406707 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zmj4\" (UniqueName: \"kubernetes.io/projected/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-api-access-5zmj4\") pod \"kube-state-metrics-0\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " pod="openstack/kube-state-metrics-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.406893 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " pod="openstack/kube-state-metrics-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.407206 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " pod="openstack/kube-state-metrics-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 
16:10:02.417332 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " pod="openstack/kube-state-metrics-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.418052 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " pod="openstack/kube-state-metrics-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.423061 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " pod="openstack/kube-state-metrics-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.436815 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zmj4\" (UniqueName: \"kubernetes.io/projected/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-api-access-5zmj4\") pod \"kube-state-metrics-0\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " pod="openstack/kube-state-metrics-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.573173 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.925020 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552650-qd6cs" event={"ID":"d9b7cbb5-41b0-4439-a0b9-ce126583684c","Type":"ContainerStarted","Data":"525b3d0120628b94472bbc40c5f17df9835454c772f2b39e875c3bd06452ddd5"} Mar 10 16:10:02 crc kubenswrapper[4749]: I0310 16:10:02.946716 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552650-qd6cs" podStartSLOduration=1.6720467509999999 podStartE2EDuration="2.946693653s" podCreationTimestamp="2026-03-10 16:10:00 +0000 UTC" firstStartedPulling="2026-03-10 16:10:01.064632038 +0000 UTC m=+1298.186497725" lastFinishedPulling="2026-03-10 16:10:02.33927893 +0000 UTC m=+1299.461144627" observedRunningTime="2026-03-10 16:10:02.94514303 +0000 UTC m=+1300.067008717" watchObservedRunningTime="2026-03-10 16:10:02.946693653 +0000 UTC m=+1300.068559350" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.027681 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.499891 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-116d-account-create-update-66pnv" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.635347 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs482\" (UniqueName: \"kubernetes.io/projected/1128619c-2c1a-4589-a013-34444a447036-kube-api-access-xs482\") pod \"1128619c-2c1a-4589-a013-34444a447036\" (UID: \"1128619c-2c1a-4589-a013-34444a447036\") " Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.635442 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1128619c-2c1a-4589-a013-34444a447036-operator-scripts\") pod \"1128619c-2c1a-4589-a013-34444a447036\" (UID: \"1128619c-2c1a-4589-a013-34444a447036\") " Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.636222 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1128619c-2c1a-4589-a013-34444a447036-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1128619c-2c1a-4589-a013-34444a447036" (UID: "1128619c-2c1a-4589-a013-34444a447036"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.643232 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a876bab-aa64-429f-bcb8-7e644cc4f547" path="/var/lib/kubelet/pods/0a876bab-aa64-429f-bcb8-7e644cc4f547/volumes" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.657388 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1128619c-2c1a-4589-a013-34444a447036-kube-api-access-xs482" (OuterVolumeSpecName: "kube-api-access-xs482") pod "1128619c-2c1a-4589-a013-34444a447036" (UID: "1128619c-2c1a-4589-a013-34444a447036"). InnerVolumeSpecName "kube-api-access-xs482". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.740522 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs482\" (UniqueName: \"kubernetes.io/projected/1128619c-2c1a-4589-a013-34444a447036-kube-api-access-xs482\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.740546 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1128619c-2c1a-4589-a013-34444a447036-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.812229 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.815227 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="87889224-54c2-4883-97e8-20e5ad3a8f8b" containerName="glance-log" containerID="cri-o://adf57b3b88b9c15bb9ab3a7da95e0ca84929d4e9d7da5bc5ffd73b9666f00f7e" gracePeriod=30 Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.815869 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="87889224-54c2-4883-97e8-20e5ad3a8f8b" containerName="glance-httpd" containerID="cri-o://7685d7f3d9ad18b7fcad8e920aa20e44c5cb9ccf8d53e0fd9f48a7461eb5386f" gracePeriod=30 Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.834863 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hqhhg" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.867771 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-939f-account-create-update-9kdfk" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.886955 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-ct548" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.894203 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d8xlj" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.897229 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88fa-account-create-update-sh94m" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.942366 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.944231 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-569gb\" (UniqueName: \"kubernetes.io/projected/e3f2f368-abe2-4fe8-835e-8c60a954ab97-kube-api-access-569gb\") pod \"e3f2f368-abe2-4fe8-835e-8c60a954ab97\" (UID: \"e3f2f368-abe2-4fe8-835e-8c60a954ab97\") " Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.944358 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gcw8\" (UniqueName: \"kubernetes.io/projected/1f0ac736-fc91-42f3-a2af-564141a88227-kube-api-access-7gcw8\") pod \"1f0ac736-fc91-42f3-a2af-564141a88227\" (UID: \"1f0ac736-fc91-42f3-a2af-564141a88227\") " Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.944427 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f0ac736-fc91-42f3-a2af-564141a88227-operator-scripts\") pod \"1f0ac736-fc91-42f3-a2af-564141a88227\" (UID: \"1f0ac736-fc91-42f3-a2af-564141a88227\") " Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 
16:10:03.944548 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3f2f368-abe2-4fe8-835e-8c60a954ab97-operator-scripts\") pod \"e3f2f368-abe2-4fe8-835e-8c60a954ab97\" (UID: \"e3f2f368-abe2-4fe8-835e-8c60a954ab97\") " Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.944608 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16f63562-8b95-41ed-92c8-f7a215854065-operator-scripts\") pod \"16f63562-8b95-41ed-92c8-f7a215854065\" (UID: \"16f63562-8b95-41ed-92c8-f7a215854065\") " Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.944634 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e84f538a-6b0b-44e6-863b-bd06abf880d7-operator-scripts\") pod \"e84f538a-6b0b-44e6-863b-bd06abf880d7\" (UID: \"e84f538a-6b0b-44e6-863b-bd06abf880d7\") " Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.944671 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8clf5\" (UniqueName: \"kubernetes.io/projected/16f63562-8b95-41ed-92c8-f7a215854065-kube-api-access-8clf5\") pod \"16f63562-8b95-41ed-92c8-f7a215854065\" (UID: \"16f63562-8b95-41ed-92c8-f7a215854065\") " Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.944710 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqhnz\" (UniqueName: \"kubernetes.io/projected/e84f538a-6b0b-44e6-863b-bd06abf880d7-kube-api-access-wqhnz\") pod \"e84f538a-6b0b-44e6-863b-bd06abf880d7\" (UID: \"e84f538a-6b0b-44e6-863b-bd06abf880d7\") " Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.944730 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/46585eed-121a-4500-a26b-70bddeeeb075-operator-scripts\") pod \"46585eed-121a-4500-a26b-70bddeeeb075\" (UID: \"46585eed-121a-4500-a26b-70bddeeeb075\") " Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.944753 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6xt8\" (UniqueName: \"kubernetes.io/projected/46585eed-121a-4500-a26b-70bddeeeb075-kube-api-access-d6xt8\") pod \"46585eed-121a-4500-a26b-70bddeeeb075\" (UID: \"46585eed-121a-4500-a26b-70bddeeeb075\") " Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.947096 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16f63562-8b95-41ed-92c8-f7a215854065-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16f63562-8b95-41ed-92c8-f7a215854065" (UID: "16f63562-8b95-41ed-92c8-f7a215854065"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.949467 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f0ac736-fc91-42f3-a2af-564141a88227-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f0ac736-fc91-42f3-a2af-564141a88227" (UID: "1f0ac736-fc91-42f3-a2af-564141a88227"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.950765 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3f2f368-abe2-4fe8-835e-8c60a954ab97-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3f2f368-abe2-4fe8-835e-8c60a954ab97" (UID: "e3f2f368-abe2-4fe8-835e-8c60a954ab97"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.954253 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46585eed-121a-4500-a26b-70bddeeeb075-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46585eed-121a-4500-a26b-70bddeeeb075" (UID: "46585eed-121a-4500-a26b-70bddeeeb075"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.957832 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f63562-8b95-41ed-92c8-f7a215854065-kube-api-access-8clf5" (OuterVolumeSpecName: "kube-api-access-8clf5") pod "16f63562-8b95-41ed-92c8-f7a215854065" (UID: "16f63562-8b95-41ed-92c8-f7a215854065"). InnerVolumeSpecName "kube-api-access-8clf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.960749 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0ac736-fc91-42f3-a2af-564141a88227-kube-api-access-7gcw8" (OuterVolumeSpecName: "kube-api-access-7gcw8") pod "1f0ac736-fc91-42f3-a2af-564141a88227" (UID: "1f0ac736-fc91-42f3-a2af-564141a88227"). InnerVolumeSpecName "kube-api-access-7gcw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.961450 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e84f538a-6b0b-44e6-863b-bd06abf880d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e84f538a-6b0b-44e6-863b-bd06abf880d7" (UID: "e84f538a-6b0b-44e6-863b-bd06abf880d7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.961671 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e84f538a-6b0b-44e6-863b-bd06abf880d7-kube-api-access-wqhnz" (OuterVolumeSpecName: "kube-api-access-wqhnz") pod "e84f538a-6b0b-44e6-863b-bd06abf880d7" (UID: "e84f538a-6b0b-44e6-863b-bd06abf880d7"). InnerVolumeSpecName "kube-api-access-wqhnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.964912 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ct548" event={"ID":"1f0ac736-fc91-42f3-a2af-564141a88227","Type":"ContainerDied","Data":"43a3520bb6cdeb249bdd84c3fcae1c86c8dce76b40370828bb7d097eef70e3f4"} Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.964951 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43a3520bb6cdeb249bdd84c3fcae1c86c8dce76b40370828bb7d097eef70e3f4" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.965041 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ct548" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.967230 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9b7cbb5-41b0-4439-a0b9-ce126583684c" containerID="525b3d0120628b94472bbc40c5f17df9835454c772f2b39e875c3bd06452ddd5" exitCode=0 Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.967397 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552650-qd6cs" event={"ID":"d9b7cbb5-41b0-4439-a0b9-ce126583684c","Type":"ContainerDied","Data":"525b3d0120628b94472bbc40c5f17df9835454c772f2b39e875c3bd06452ddd5"} Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.969871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-116d-account-create-update-66pnv" event={"ID":"1128619c-2c1a-4589-a013-34444a447036","Type":"ContainerDied","Data":"458fa37559c158012335c6ee0cb81f0c9c93e1d51747e38d0abb4f5edd8c6cba"} Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.969917 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="458fa37559c158012335c6ee0cb81f0c9c93e1d51747e38d0abb4f5edd8c6cba" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.970681 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-116d-account-create-update-66pnv" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.977893 4749 generic.go:334] "Generic (PLEG): container finished" podID="f83d36e6-e860-47a3-8590-d0a468a8819a" containerID="c556b2bfacae6027405b4d2e2aa2707c6cb88193c5f1a7d8fa814be6d3f556d5" exitCode=0 Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.978074 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.980250 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46585eed-121a-4500-a26b-70bddeeeb075-kube-api-access-d6xt8" (OuterVolumeSpecName: "kube-api-access-d6xt8") pod "46585eed-121a-4500-a26b-70bddeeeb075" (UID: "46585eed-121a-4500-a26b-70bddeeeb075"). InnerVolumeSpecName "kube-api-access-d6xt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.982003 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f83d36e6-e860-47a3-8590-d0a468a8819a","Type":"ContainerDied","Data":"c556b2bfacae6027405b4d2e2aa2707c6cb88193c5f1a7d8fa814be6d3f556d5"} Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.982058 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f83d36e6-e860-47a3-8590-d0a468a8819a","Type":"ContainerDied","Data":"cdce3f6ce3fde6f80be4b89d19db2a485e039cbe3dff5057d1201aad3884b880"} Mar 10 16:10:03 crc kubenswrapper[4749]: I0310 16:10:03.982077 4749 scope.go:117] "RemoveContainer" containerID="c556b2bfacae6027405b4d2e2aa2707c6cb88193c5f1a7d8fa814be6d3f556d5" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.009938 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-939f-account-create-update-9kdfk" event={"ID":"e84f538a-6b0b-44e6-863b-bd06abf880d7","Type":"ContainerDied","Data":"08cd37b035ff9819f53bf0d5b89a7ddb709923ba738e95c0245cc2df49b9dc79"} Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.009977 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08cd37b035ff9819f53bf0d5b89a7ddb709923ba738e95c0245cc2df49b9dc79" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.010117 4749 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell0-939f-account-create-update-9kdfk" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.017242 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-88fa-account-create-update-sh94m" event={"ID":"46585eed-121a-4500-a26b-70bddeeeb075","Type":"ContainerDied","Data":"72c235027498d35291c75c90c319957ba757cf9aab89f738ebf80197679d25c7"} Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.017276 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72c235027498d35291c75c90c319957ba757cf9aab89f738ebf80197679d25c7" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.017349 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88fa-account-create-update-sh94m" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.032578 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3f2f368-abe2-4fe8-835e-8c60a954ab97-kube-api-access-569gb" (OuterVolumeSpecName: "kube-api-access-569gb") pod "e3f2f368-abe2-4fe8-835e-8c60a954ab97" (UID: "e3f2f368-abe2-4fe8-835e-8c60a954ab97"). InnerVolumeSpecName "kube-api-access-569gb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.034672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b61b53b7-481c-4db6-9cf3-fd824848684c","Type":"ContainerStarted","Data":"08f66bfe75b4ccbef6dad8e9ad3817528d1ea2726f6b129318a6321118f5d6fd"} Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.038501 4749 scope.go:117] "RemoveContainer" containerID="8fbe1339ff08841cc1604985c8269450f32793172da5b912329dedd533f298cd" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.044968 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d8xlj" event={"ID":"e3f2f368-abe2-4fe8-835e-8c60a954ab97","Type":"ContainerDied","Data":"957850ff5b5ff0091c4bfa9c0bf18bb19fea1538ec601d4c3b619a25cb550a88"} Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.045009 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="957850ff5b5ff0091c4bfa9c0bf18bb19fea1538ec601d4c3b619a25cb550a88" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.045083 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-d8xlj" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.047185 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-config-data\") pod \"f83d36e6-e860-47a3-8590-d0a468a8819a\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.047249 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-public-tls-certs\") pod \"f83d36e6-e860-47a3-8590-d0a468a8819a\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.047298 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"f83d36e6-e860-47a3-8590-d0a468a8819a\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.047457 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-combined-ca-bundle\") pod \"f83d36e6-e860-47a3-8590-d0a468a8819a\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.047535 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f83d36e6-e860-47a3-8590-d0a468a8819a-logs\") pod \"f83d36e6-e860-47a3-8590-d0a468a8819a\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.047587 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-scripts\") pod \"f83d36e6-e860-47a3-8590-d0a468a8819a\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.047617 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f83d36e6-e860-47a3-8590-d0a468a8819a-httpd-run\") pod \"f83d36e6-e860-47a3-8590-d0a468a8819a\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.047644 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvrhn\" (UniqueName: \"kubernetes.io/projected/f83d36e6-e860-47a3-8590-d0a468a8819a-kube-api-access-bvrhn\") pod \"f83d36e6-e860-47a3-8590-d0a468a8819a\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.048242 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gcw8\" (UniqueName: \"kubernetes.io/projected/1f0ac736-fc91-42f3-a2af-564141a88227-kube-api-access-7gcw8\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.048269 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f0ac736-fc91-42f3-a2af-564141a88227-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.048280 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3f2f368-abe2-4fe8-835e-8c60a954ab97-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.048292 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16f63562-8b95-41ed-92c8-f7a215854065-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc 
kubenswrapper[4749]: I0310 16:10:04.048303 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e84f538a-6b0b-44e6-863b-bd06abf880d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.048316 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8clf5\" (UniqueName: \"kubernetes.io/projected/16f63562-8b95-41ed-92c8-f7a215854065-kube-api-access-8clf5\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.048328 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqhnz\" (UniqueName: \"kubernetes.io/projected/e84f538a-6b0b-44e6-863b-bd06abf880d7-kube-api-access-wqhnz\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.048339 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46585eed-121a-4500-a26b-70bddeeeb075-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.048351 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6xt8\" (UniqueName: \"kubernetes.io/projected/46585eed-121a-4500-a26b-70bddeeeb075-kube-api-access-d6xt8\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.048364 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-569gb\" (UniqueName: \"kubernetes.io/projected/e3f2f368-abe2-4fe8-835e-8c60a954ab97-kube-api-access-569gb\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.053461 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f83d36e6-e860-47a3-8590-d0a468a8819a-kube-api-access-bvrhn" (OuterVolumeSpecName: "kube-api-access-bvrhn") pod "f83d36e6-e860-47a3-8590-d0a468a8819a" (UID: 
"f83d36e6-e860-47a3-8590-d0a468a8819a"). InnerVolumeSpecName "kube-api-access-bvrhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.053734 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f83d36e6-e860-47a3-8590-d0a468a8819a-logs" (OuterVolumeSpecName: "logs") pod "f83d36e6-e860-47a3-8590-d0a468a8819a" (UID: "f83d36e6-e860-47a3-8590-d0a468a8819a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.056501 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f83d36e6-e860-47a3-8590-d0a468a8819a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f83d36e6-e860-47a3-8590-d0a468a8819a" (UID: "f83d36e6-e860-47a3-8590-d0a468a8819a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.056746 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0a8246b1-28b8-4eb6-83a3-1e87beecfb78","Type":"ContainerStarted","Data":"fc7c26df644965a11d66d3971b0f14d486ea00dfbfe753e9e176059c55e661ad"} Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.056814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0a8246b1-28b8-4eb6-83a3-1e87beecfb78","Type":"ContainerStarted","Data":"5664a16b9bca18b3a86d87785a9e86c27c72f0ed8bbe7da6cc5d33b19f9a0194"} Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.057134 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.059043 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-scripts" (OuterVolumeSpecName: 
"scripts") pod "f83d36e6-e860-47a3-8590-d0a468a8819a" (UID: "f83d36e6-e860-47a3-8590-d0a468a8819a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.061797 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "f83d36e6-e860-47a3-8590-d0a468a8819a" (UID: "f83d36e6-e860-47a3-8590-d0a468a8819a"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.062007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hqhhg" event={"ID":"16f63562-8b95-41ed-92c8-f7a215854065","Type":"ContainerDied","Data":"a58de2a1030e1413e091229fad9f8dff8d47b5d5701eec2e9336f02ca3fbcb47"} Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.062056 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a58de2a1030e1413e091229fad9f8dff8d47b5d5701eec2e9336f02ca3fbcb47" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.071339 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hqhhg" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.089295 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.611612419 podStartE2EDuration="2.089277215s" podCreationTimestamp="2026-03-10 16:10:02 +0000 UTC" firstStartedPulling="2026-03-10 16:10:03.102355753 +0000 UTC m=+1300.224221440" lastFinishedPulling="2026-03-10 16:10:03.580020549 +0000 UTC m=+1300.701886236" observedRunningTime="2026-03-10 16:10:04.082834578 +0000 UTC m=+1301.204700265" watchObservedRunningTime="2026-03-10 16:10:04.089277215 +0000 UTC m=+1301.211142902" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.115419 4749 scope.go:117] "RemoveContainer" containerID="c556b2bfacae6027405b4d2e2aa2707c6cb88193c5f1a7d8fa814be6d3f556d5" Mar 10 16:10:04 crc kubenswrapper[4749]: E0310 16:10:04.120677 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c556b2bfacae6027405b4d2e2aa2707c6cb88193c5f1a7d8fa814be6d3f556d5\": container with ID starting with c556b2bfacae6027405b4d2e2aa2707c6cb88193c5f1a7d8fa814be6d3f556d5 not found: ID does not exist" containerID="c556b2bfacae6027405b4d2e2aa2707c6cb88193c5f1a7d8fa814be6d3f556d5" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.120732 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c556b2bfacae6027405b4d2e2aa2707c6cb88193c5f1a7d8fa814be6d3f556d5"} err="failed to get container status \"c556b2bfacae6027405b4d2e2aa2707c6cb88193c5f1a7d8fa814be6d3f556d5\": rpc error: code = NotFound desc = could not find container \"c556b2bfacae6027405b4d2e2aa2707c6cb88193c5f1a7d8fa814be6d3f556d5\": container with ID starting with c556b2bfacae6027405b4d2e2aa2707c6cb88193c5f1a7d8fa814be6d3f556d5 not found: ID does not exist" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.121192 4749 
scope.go:117] "RemoveContainer" containerID="8fbe1339ff08841cc1604985c8269450f32793172da5b912329dedd533f298cd" Mar 10 16:10:04 crc kubenswrapper[4749]: E0310 16:10:04.122903 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fbe1339ff08841cc1604985c8269450f32793172da5b912329dedd533f298cd\": container with ID starting with 8fbe1339ff08841cc1604985c8269450f32793172da5b912329dedd533f298cd not found: ID does not exist" containerID="8fbe1339ff08841cc1604985c8269450f32793172da5b912329dedd533f298cd" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.122950 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fbe1339ff08841cc1604985c8269450f32793172da5b912329dedd533f298cd"} err="failed to get container status \"8fbe1339ff08841cc1604985c8269450f32793172da5b912329dedd533f298cd\": rpc error: code = NotFound desc = could not find container \"8fbe1339ff08841cc1604985c8269450f32793172da5b912329dedd533f298cd\": container with ID starting with 8fbe1339ff08841cc1604985c8269450f32793172da5b912329dedd533f298cd not found: ID does not exist" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.130505 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f83d36e6-e860-47a3-8590-d0a468a8819a" (UID: "f83d36e6-e860-47a3-8590-d0a468a8819a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.148753 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-config-data" (OuterVolumeSpecName: "config-data") pod "f83d36e6-e860-47a3-8590-d0a468a8819a" (UID: "f83d36e6-e860-47a3-8590-d0a468a8819a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.154835 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-config-data\") pod \"f83d36e6-e860-47a3-8590-d0a468a8819a\" (UID: \"f83d36e6-e860-47a3-8590-d0a468a8819a\") " Mar 10 16:10:04 crc kubenswrapper[4749]: W0310 16:10:04.155280 4749 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f83d36e6-e860-47a3-8590-d0a468a8819a/volumes/kubernetes.io~secret/config-data Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.155301 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-config-data" (OuterVolumeSpecName: "config-data") pod "f83d36e6-e860-47a3-8590-d0a468a8819a" (UID: "f83d36e6-e860-47a3-8590-d0a468a8819a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.156923 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.157091 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f83d36e6-e860-47a3-8590-d0a468a8819a-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.157112 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.157124 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f83d36e6-e860-47a3-8590-d0a468a8819a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.157137 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvrhn\" (UniqueName: \"kubernetes.io/projected/f83d36e6-e860-47a3-8590-d0a468a8819a-kube-api-access-bvrhn\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.157152 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.157174 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.163867 4749 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f83d36e6-e860-47a3-8590-d0a468a8819a" (UID: "f83d36e6-e860-47a3-8590-d0a468a8819a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.179397 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.259367 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f83d36e6-e860-47a3-8590-d0a468a8819a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.259431 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.325426 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.334024 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.355715 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:10:04 crc kubenswrapper[4749]: E0310 16:10:04.356096 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3f2f368-abe2-4fe8-835e-8c60a954ab97" containerName="mariadb-database-create" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356113 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3f2f368-abe2-4fe8-835e-8c60a954ab97" containerName="mariadb-database-create" Mar 10 16:10:04 crc 
kubenswrapper[4749]: E0310 16:10:04.356140 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83d36e6-e860-47a3-8590-d0a468a8819a" containerName="glance-httpd" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356146 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83d36e6-e860-47a3-8590-d0a468a8819a" containerName="glance-httpd" Mar 10 16:10:04 crc kubenswrapper[4749]: E0310 16:10:04.356159 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e84f538a-6b0b-44e6-863b-bd06abf880d7" containerName="mariadb-account-create-update" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356165 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84f538a-6b0b-44e6-863b-bd06abf880d7" containerName="mariadb-account-create-update" Mar 10 16:10:04 crc kubenswrapper[4749]: E0310 16:10:04.356178 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1128619c-2c1a-4589-a013-34444a447036" containerName="mariadb-account-create-update" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356184 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1128619c-2c1a-4589-a013-34444a447036" containerName="mariadb-account-create-update" Mar 10 16:10:04 crc kubenswrapper[4749]: E0310 16:10:04.356197 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0ac736-fc91-42f3-a2af-564141a88227" containerName="mariadb-database-create" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356202 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0ac736-fc91-42f3-a2af-564141a88227" containerName="mariadb-database-create" Mar 10 16:10:04 crc kubenswrapper[4749]: E0310 16:10:04.356215 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46585eed-121a-4500-a26b-70bddeeeb075" containerName="mariadb-account-create-update" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356221 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="46585eed-121a-4500-a26b-70bddeeeb075" containerName="mariadb-account-create-update" Mar 10 16:10:04 crc kubenswrapper[4749]: E0310 16:10:04.356231 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83d36e6-e860-47a3-8590-d0a468a8819a" containerName="glance-log" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356237 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83d36e6-e860-47a3-8590-d0a468a8819a" containerName="glance-log" Mar 10 16:10:04 crc kubenswrapper[4749]: E0310 16:10:04.356244 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f63562-8b95-41ed-92c8-f7a215854065" containerName="mariadb-database-create" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356250 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f63562-8b95-41ed-92c8-f7a215854065" containerName="mariadb-database-create" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356481 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3f2f368-abe2-4fe8-835e-8c60a954ab97" containerName="mariadb-database-create" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356509 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="46585eed-121a-4500-a26b-70bddeeeb075" containerName="mariadb-account-create-update" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356519 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83d36e6-e860-47a3-8590-d0a468a8819a" containerName="glance-log" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356529 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83d36e6-e860-47a3-8590-d0a468a8819a" containerName="glance-httpd" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356538 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f0ac736-fc91-42f3-a2af-564141a88227" containerName="mariadb-database-create" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356547 4749 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e84f538a-6b0b-44e6-863b-bd06abf880d7" containerName="mariadb-account-create-update" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356557 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="16f63562-8b95-41ed-92c8-f7a215854065" containerName="mariadb-database-create" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.356565 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1128619c-2c1a-4589-a013-34444a447036" containerName="mariadb-account-create-update" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.357483 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.360279 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.360537 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.366936 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.463324 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-config-data\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.463872 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.463957 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.463989 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15480433-b4c2-47c5-a7e4-73395b5bd27d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.464029 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-scripts\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.464257 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.464359 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15480433-b4c2-47c5-a7e4-73395b5bd27d-logs\") pod 
\"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.464517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxd9v\" (UniqueName: \"kubernetes.io/projected/15480433-b4c2-47c5-a7e4-73395b5bd27d-kube-api-access-wxd9v\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.566420 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-scripts\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.566543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.566584 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15480433-b4c2-47c5-a7e4-73395b5bd27d-logs\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.566639 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxd9v\" (UniqueName: \"kubernetes.io/projected/15480433-b4c2-47c5-a7e4-73395b5bd27d-kube-api-access-wxd9v\") pod \"glance-default-external-api-0\" 
(UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.566673 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-config-data\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.566696 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.566741 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.566767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15480433-b4c2-47c5-a7e4-73395b5bd27d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.567411 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15480433-b4c2-47c5-a7e4-73395b5bd27d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 
10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.568822 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15480433-b4c2-47c5-a7e4-73395b5bd27d-logs\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.569068 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.573128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-scripts\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.573206 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.574187 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-config-data\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.578105 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.591847 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxd9v\" (UniqueName: \"kubernetes.io/projected/15480433-b4c2-47c5-a7e4-73395b5bd27d-kube-api-access-wxd9v\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.601802 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " pod="openstack/glance-default-external-api-0" Mar 10 16:10:04 crc kubenswrapper[4749]: I0310 16:10:04.685487 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:10:05 crc kubenswrapper[4749]: I0310 16:10:05.073779 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b61b53b7-481c-4db6-9cf3-fd824848684c","Type":"ContainerStarted","Data":"f76e09d1b1b17d297ed8a212aa28a1ec8f250f800f4a65d7f50a5698152a8cc0"} Mar 10 16:10:05 crc kubenswrapper[4749]: I0310 16:10:05.076188 4749 generic.go:334] "Generic (PLEG): container finished" podID="87889224-54c2-4883-97e8-20e5ad3a8f8b" containerID="adf57b3b88b9c15bb9ab3a7da95e0ca84929d4e9d7da5bc5ffd73b9666f00f7e" exitCode=143 Mar 10 16:10:05 crc kubenswrapper[4749]: I0310 16:10:05.076339 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"87889224-54c2-4883-97e8-20e5ad3a8f8b","Type":"ContainerDied","Data":"adf57b3b88b9c15bb9ab3a7da95e0ca84929d4e9d7da5bc5ffd73b9666f00f7e"} Mar 10 16:10:05 crc kubenswrapper[4749]: I0310 16:10:05.270411 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:10:05 crc kubenswrapper[4749]: I0310 16:10:05.367229 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552650-qd6cs" Mar 10 16:10:05 crc kubenswrapper[4749]: I0310 16:10:05.499470 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krdn5\" (UniqueName: \"kubernetes.io/projected/d9b7cbb5-41b0-4439-a0b9-ce126583684c-kube-api-access-krdn5\") pod \"d9b7cbb5-41b0-4439-a0b9-ce126583684c\" (UID: \"d9b7cbb5-41b0-4439-a0b9-ce126583684c\") " Mar 10 16:10:05 crc kubenswrapper[4749]: I0310 16:10:05.510712 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b7cbb5-41b0-4439-a0b9-ce126583684c-kube-api-access-krdn5" (OuterVolumeSpecName: "kube-api-access-krdn5") pod "d9b7cbb5-41b0-4439-a0b9-ce126583684c" (UID: "d9b7cbb5-41b0-4439-a0b9-ce126583684c"). InnerVolumeSpecName "kube-api-access-krdn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:05 crc kubenswrapper[4749]: I0310 16:10:05.605511 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krdn5\" (UniqueName: \"kubernetes.io/projected/d9b7cbb5-41b0-4439-a0b9-ce126583684c-kube-api-access-krdn5\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:05 crc kubenswrapper[4749]: I0310 16:10:05.616395 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f83d36e6-e860-47a3-8590-d0a468a8819a" path="/var/lib/kubelet/pods/f83d36e6-e860-47a3-8590-d0a468a8819a/volumes" Mar 10 16:10:05 crc kubenswrapper[4749]: I0310 16:10:05.743311 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:10:05 crc kubenswrapper[4749]: I0310 16:10:05.824350 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7d596895b8-zh48t"] Mar 10 16:10:05 crc kubenswrapper[4749]: I0310 16:10:05.827239 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7d596895b8-zh48t" 
podUID="0893ff76-efa3-496c-b499-0f6e3a4ffd59" containerName="neutron-api" containerID="cri-o://a84826754d7077e0e9d7942da7d6c3211342df1be5029fe9d914ac549a6f2958" gracePeriod=30 Mar 10 16:10:05 crc kubenswrapper[4749]: I0310 16:10:05.827388 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7d596895b8-zh48t" podUID="0893ff76-efa3-496c-b499-0f6e3a4ffd59" containerName="neutron-httpd" containerID="cri-o://47d39ee2038a7c20b81e06ce8eb1f8cc68d20deef858b1d0a015b6fa325c1429" gracePeriod=30 Mar 10 16:10:06 crc kubenswrapper[4749]: I0310 16:10:06.002084 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:10:06 crc kubenswrapper[4749]: I0310 16:10:06.020002 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:10:06 crc kubenswrapper[4749]: I0310 16:10:06.027602 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552644-9h9tm"] Mar 10 16:10:06 crc kubenswrapper[4749]: I0310 16:10:06.071822 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552644-9h9tm"] Mar 10 16:10:06 crc kubenswrapper[4749]: I0310 16:10:06.093922 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552650-qd6cs" Mar 10 16:10:06 crc kubenswrapper[4749]: I0310 16:10:06.094673 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552650-qd6cs" event={"ID":"d9b7cbb5-41b0-4439-a0b9-ce126583684c","Type":"ContainerDied","Data":"c48f1d38840320eccf734b1283e6895ed5f07622dbe8c59723df7ef00e483806"} Mar 10 16:10:06 crc kubenswrapper[4749]: I0310 16:10:06.094697 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c48f1d38840320eccf734b1283e6895ed5f07622dbe8c59723df7ef00e483806" Mar 10 16:10:06 crc kubenswrapper[4749]: I0310 16:10:06.100167 4749 generic.go:334] "Generic (PLEG): container finished" podID="0893ff76-efa3-496c-b499-0f6e3a4ffd59" containerID="47d39ee2038a7c20b81e06ce8eb1f8cc68d20deef858b1d0a015b6fa325c1429" exitCode=0 Mar 10 16:10:06 crc kubenswrapper[4749]: I0310 16:10:06.100236 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d596895b8-zh48t" event={"ID":"0893ff76-efa3-496c-b499-0f6e3a4ffd59","Type":"ContainerDied","Data":"47d39ee2038a7c20b81e06ce8eb1f8cc68d20deef858b1d0a015b6fa325c1429"} Mar 10 16:10:06 crc kubenswrapper[4749]: I0310 16:10:06.103089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15480433-b4c2-47c5-a7e4-73395b5bd27d","Type":"ContainerStarted","Data":"c88c00c58bcfe5242271bb002d37c1a1a9cd3e5dd3b4b9465326ba4e737970b2"} Mar 10 16:10:06 crc kubenswrapper[4749]: I0310 16:10:06.103117 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15480433-b4c2-47c5-a7e4-73395b5bd27d","Type":"ContainerStarted","Data":"155ef0556de8c3381936aa3a148a225cfea9fc1e296109ceebf18e89c3c6a91b"} Mar 10 16:10:06 crc kubenswrapper[4749]: I0310 16:10:06.988439 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" 
podUID="87889224-54c2-4883-97e8-20e5ad3a8f8b" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": read tcp 10.217.0.2:52890->10.217.0.156:9292: read: connection reset by peer" Mar 10 16:10:06 crc kubenswrapper[4749]: I0310 16:10:06.988472 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="87889224-54c2-4883-97e8-20e5ad3a8f8b" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": read tcp 10.217.0.2:52888->10.217.0.156:9292: read: connection reset by peer" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.114429 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15480433-b4c2-47c5-a7e4-73395b5bd27d","Type":"ContainerStarted","Data":"236964c999aec36cddb5fe0239f2b923e3a235f3f2a6498c0a9402202498207b"} Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.118697 4749 generic.go:334] "Generic (PLEG): container finished" podID="87889224-54c2-4883-97e8-20e5ad3a8f8b" containerID="7685d7f3d9ad18b7fcad8e920aa20e44c5cb9ccf8d53e0fd9f48a7461eb5386f" exitCode=0 Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.118797 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"87889224-54c2-4883-97e8-20e5ad3a8f8b","Type":"ContainerDied","Data":"7685d7f3d9ad18b7fcad8e920aa20e44c5cb9ccf8d53e0fd9f48a7461eb5386f"} Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.130998 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b61b53b7-481c-4db6-9cf3-fd824848684c","Type":"ContainerStarted","Data":"c8b9bd52a663644011966a5099dabeecfeee65a2c7dde2150c0d24d913aebf60"} Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.131438 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 16:10:07 crc 
kubenswrapper[4749]: I0310 16:10:07.131251 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="ceilometer-notification-agent" containerID="cri-o://08f66bfe75b4ccbef6dad8e9ad3817528d1ea2726f6b129318a6321118f5d6fd" gracePeriod=30 Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.131278 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="proxy-httpd" containerID="cri-o://c8b9bd52a663644011966a5099dabeecfeee65a2c7dde2150c0d24d913aebf60" gracePeriod=30 Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.131278 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="sg-core" containerID="cri-o://f76e09d1b1b17d297ed8a212aa28a1ec8f250f800f4a65d7f50a5698152a8cc0" gracePeriod=30 Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.131185 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="ceilometer-central-agent" containerID="cri-o://988657a47e0c4a66a76e1436fcd2954746b2fe0818b289c493f220bd639481c4" gracePeriod=30 Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.150311 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.150288096 podStartE2EDuration="3.150288096s" podCreationTimestamp="2026-03-10 16:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:10:07.143997652 +0000 UTC m=+1304.265863339" watchObservedRunningTime="2026-03-10 16:10:07.150288096 +0000 UTC m=+1304.272153783" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 
16:10:07.177924 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.719132084 podStartE2EDuration="9.177898856s" podCreationTimestamp="2026-03-10 16:09:58 +0000 UTC" firstStartedPulling="2026-03-10 16:10:00.212169832 +0000 UTC m=+1297.334035519" lastFinishedPulling="2026-03-10 16:10:06.670936604 +0000 UTC m=+1303.792802291" observedRunningTime="2026-03-10 16:10:07.165726611 +0000 UTC m=+1304.287592298" watchObservedRunningTime="2026-03-10 16:10:07.177898856 +0000 UTC m=+1304.299764543" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.505944 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.558894 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"87889224-54c2-4883-97e8-20e5ad3a8f8b\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.558945 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87889224-54c2-4883-97e8-20e5ad3a8f8b-logs\") pod \"87889224-54c2-4883-97e8-20e5ad3a8f8b\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.558999 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-combined-ca-bundle\") pod \"87889224-54c2-4883-97e8-20e5ad3a8f8b\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.559070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8pln\" (UniqueName: 
\"kubernetes.io/projected/87889224-54c2-4883-97e8-20e5ad3a8f8b-kube-api-access-d8pln\") pod \"87889224-54c2-4883-97e8-20e5ad3a8f8b\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.559085 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-config-data\") pod \"87889224-54c2-4883-97e8-20e5ad3a8f8b\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.559143 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-internal-tls-certs\") pod \"87889224-54c2-4883-97e8-20e5ad3a8f8b\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.559162 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-scripts\") pod \"87889224-54c2-4883-97e8-20e5ad3a8f8b\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.559240 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87889224-54c2-4883-97e8-20e5ad3a8f8b-httpd-run\") pod \"87889224-54c2-4883-97e8-20e5ad3a8f8b\" (UID: \"87889224-54c2-4883-97e8-20e5ad3a8f8b\") " Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.562471 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87889224-54c2-4883-97e8-20e5ad3a8f8b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "87889224-54c2-4883-97e8-20e5ad3a8f8b" (UID: "87889224-54c2-4883-97e8-20e5ad3a8f8b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.564712 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87889224-54c2-4883-97e8-20e5ad3a8f8b-logs" (OuterVolumeSpecName: "logs") pod "87889224-54c2-4883-97e8-20e5ad3a8f8b" (UID: "87889224-54c2-4883-97e8-20e5ad3a8f8b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.575293 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87889224-54c2-4883-97e8-20e5ad3a8f8b-kube-api-access-d8pln" (OuterVolumeSpecName: "kube-api-access-d8pln") pod "87889224-54c2-4883-97e8-20e5ad3a8f8b" (UID: "87889224-54c2-4883-97e8-20e5ad3a8f8b"). InnerVolumeSpecName "kube-api-access-d8pln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.582607 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "87889224-54c2-4883-97e8-20e5ad3a8f8b" (UID: "87889224-54c2-4883-97e8-20e5ad3a8f8b"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.612627 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-scripts" (OuterVolumeSpecName: "scripts") pod "87889224-54c2-4883-97e8-20e5ad3a8f8b" (UID: "87889224-54c2-4883-97e8-20e5ad3a8f8b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.618489 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87889224-54c2-4883-97e8-20e5ad3a8f8b" (UID: "87889224-54c2-4883-97e8-20e5ad3a8f8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.655263 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="275c8c2e-026a-4773-b021-58644290e646" path="/var/lib/kubelet/pods/275c8c2e-026a-4773-b021-58644290e646/volumes" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.665823 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.665858 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87889224-54c2-4883-97e8-20e5ad3a8f8b-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.665869 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.665878 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8pln\" (UniqueName: \"kubernetes.io/projected/87889224-54c2-4883-97e8-20e5ad3a8f8b-kube-api-access-d8pln\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.665888 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.665896 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87889224-54c2-4883-97e8-20e5ad3a8f8b-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.698903 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.739872 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.745575 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-config-data" (OuterVolumeSpecName: "config-data") pod "87889224-54c2-4883-97e8-20e5ad3a8f8b" (UID: "87889224-54c2-4883-97e8-20e5ad3a8f8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.752251 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "87889224-54c2-4883-97e8-20e5ad3a8f8b" (UID: "87889224-54c2-4883-97e8-20e5ad3a8f8b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.768535 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.768575 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87889224-54c2-4883-97e8-20e5ad3a8f8b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:07 crc kubenswrapper[4749]: I0310 16:10:07.768589 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.143677 4749 generic.go:334] "Generic (PLEG): container finished" podID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerID="c8b9bd52a663644011966a5099dabeecfeee65a2c7dde2150c0d24d913aebf60" exitCode=0 Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.143708 4749 generic.go:334] "Generic (PLEG): container finished" podID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerID="f76e09d1b1b17d297ed8a212aa28a1ec8f250f800f4a65d7f50a5698152a8cc0" exitCode=2 Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.143715 4749 generic.go:334] "Generic (PLEG): container finished" podID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerID="08f66bfe75b4ccbef6dad8e9ad3817528d1ea2726f6b129318a6321118f5d6fd" exitCode=0 Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.143748 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b61b53b7-481c-4db6-9cf3-fd824848684c","Type":"ContainerDied","Data":"c8b9bd52a663644011966a5099dabeecfeee65a2c7dde2150c0d24d913aebf60"} Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.143772 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"b61b53b7-481c-4db6-9cf3-fd824848684c","Type":"ContainerDied","Data":"f76e09d1b1b17d297ed8a212aa28a1ec8f250f800f4a65d7f50a5698152a8cc0"} Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.143781 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b61b53b7-481c-4db6-9cf3-fd824848684c","Type":"ContainerDied","Data":"08f66bfe75b4ccbef6dad8e9ad3817528d1ea2726f6b129318a6321118f5d6fd"} Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.146453 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.147749 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"87889224-54c2-4883-97e8-20e5ad3a8f8b","Type":"ContainerDied","Data":"bda6a20356cb4a8d7076ab876c23ba60c794e84bf8e69809f88a06da62151e03"} Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.147812 4749 scope.go:117] "RemoveContainer" containerID="7685d7f3d9ad18b7fcad8e920aa20e44c5cb9ccf8d53e0fd9f48a7461eb5386f" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.180218 4749 scope.go:117] "RemoveContainer" containerID="adf57b3b88b9c15bb9ab3a7da95e0ca84929d4e9d7da5bc5ffd73b9666f00f7e" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.191822 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.208253 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.226194 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:10:08 crc kubenswrapper[4749]: E0310 16:10:08.226760 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="87889224-54c2-4883-97e8-20e5ad3a8f8b" containerName="glance-httpd" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.226782 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="87889224-54c2-4883-97e8-20e5ad3a8f8b" containerName="glance-httpd" Mar 10 16:10:08 crc kubenswrapper[4749]: E0310 16:10:08.226813 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87889224-54c2-4883-97e8-20e5ad3a8f8b" containerName="glance-log" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.226822 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="87889224-54c2-4883-97e8-20e5ad3a8f8b" containerName="glance-log" Mar 10 16:10:08 crc kubenswrapper[4749]: E0310 16:10:08.226838 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b7cbb5-41b0-4439-a0b9-ce126583684c" containerName="oc" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.226845 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b7cbb5-41b0-4439-a0b9-ce126583684c" containerName="oc" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.227045 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="87889224-54c2-4883-97e8-20e5ad3a8f8b" containerName="glance-log" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.227072 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="87889224-54c2-4883-97e8-20e5ad3a8f8b" containerName="glance-httpd" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.227085 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b7cbb5-41b0-4439-a0b9-ce126583684c" containerName="oc" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.228229 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.233463 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.233739 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.239036 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.378915 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.379076 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.379228 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q55gw\" (UniqueName: \"kubernetes.io/projected/1b598099-b3f7-4157-8e5f-6eb472806511-kube-api-access-q55gw\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.379980 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.380047 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.380097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.380165 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b598099-b3f7-4157-8e5f-6eb472806511-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.380209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b598099-b3f7-4157-8e5f-6eb472806511-logs\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.482384 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.482426 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.482476 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b598099-b3f7-4157-8e5f-6eb472806511-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.482508 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b598099-b3f7-4157-8e5f-6eb472806511-logs\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.482558 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.482580 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.482614 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q55gw\" (UniqueName: \"kubernetes.io/projected/1b598099-b3f7-4157-8e5f-6eb472806511-kube-api-access-q55gw\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.482660 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.482856 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.483087 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b598099-b3f7-4157-8e5f-6eb472806511-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.483515 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b598099-b3f7-4157-8e5f-6eb472806511-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.486906 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.487350 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.487355 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.499542 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.509067 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q55gw\" (UniqueName: \"kubernetes.io/projected/1b598099-b3f7-4157-8e5f-6eb472806511-kube-api-access-q55gw\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.515116 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " pod="openstack/glance-default-internal-api-0" Mar 10 16:10:08 crc kubenswrapper[4749]: I0310 16:10:08.545862 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.421413 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c6g4w"] Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.422737 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.424941 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.425718 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qlxkb" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.428976 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.457679 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.478772 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c6g4w"] Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.606364 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-config-data\") pod \"nova-cell0-conductor-db-sync-c6g4w\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.606606 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c6g4w\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.606876 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwkfs\" (UniqueName: \"kubernetes.io/projected/9945ae2b-1140-4eb6-8212-c56f874dc891-kube-api-access-wwkfs\") pod \"nova-cell0-conductor-db-sync-c6g4w\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.606958 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-scripts\") pod \"nova-cell0-conductor-db-sync-c6g4w\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.622825 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87889224-54c2-4883-97e8-20e5ad3a8f8b" path="/var/lib/kubelet/pods/87889224-54c2-4883-97e8-20e5ad3a8f8b/volumes" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.709086 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-c6g4w\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.709186 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwkfs\" (UniqueName: \"kubernetes.io/projected/9945ae2b-1140-4eb6-8212-c56f874dc891-kube-api-access-wwkfs\") pod \"nova-cell0-conductor-db-sync-c6g4w\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.709229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-scripts\") pod \"nova-cell0-conductor-db-sync-c6g4w\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.709288 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-config-data\") pod \"nova-cell0-conductor-db-sync-c6g4w\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.723229 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c6g4w\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.725994 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-scripts\") pod 
\"nova-cell0-conductor-db-sync-c6g4w\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.730037 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwkfs\" (UniqueName: \"kubernetes.io/projected/9945ae2b-1140-4eb6-8212-c56f874dc891-kube-api-access-wwkfs\") pod \"nova-cell0-conductor-db-sync-c6g4w\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.741657 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-config-data\") pod \"nova-cell0-conductor-db-sync-c6g4w\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.764108 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:09 crc kubenswrapper[4749]: I0310 16:10:09.848712 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.013476 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-config-data\") pod \"b61b53b7-481c-4db6-9cf3-fd824848684c\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.013877 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-combined-ca-bundle\") pod \"b61b53b7-481c-4db6-9cf3-fd824848684c\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.013926 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-sg-core-conf-yaml\") pod \"b61b53b7-481c-4db6-9cf3-fd824848684c\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.013942 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-scripts\") pod \"b61b53b7-481c-4db6-9cf3-fd824848684c\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.013961 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvhgt\" (UniqueName: \"kubernetes.io/projected/b61b53b7-481c-4db6-9cf3-fd824848684c-kube-api-access-bvhgt\") pod \"b61b53b7-481c-4db6-9cf3-fd824848684c\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.013982 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b61b53b7-481c-4db6-9cf3-fd824848684c-log-httpd\") pod \"b61b53b7-481c-4db6-9cf3-fd824848684c\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.014037 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b61b53b7-481c-4db6-9cf3-fd824848684c-run-httpd\") pod \"b61b53b7-481c-4db6-9cf3-fd824848684c\" (UID: \"b61b53b7-481c-4db6-9cf3-fd824848684c\") " Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.014662 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b61b53b7-481c-4db6-9cf3-fd824848684c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b61b53b7-481c-4db6-9cf3-fd824848684c" (UID: "b61b53b7-481c-4db6-9cf3-fd824848684c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.015996 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b61b53b7-481c-4db6-9cf3-fd824848684c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b61b53b7-481c-4db6-9cf3-fd824848684c" (UID: "b61b53b7-481c-4db6-9cf3-fd824848684c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.018246 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-scripts" (OuterVolumeSpecName: "scripts") pod "b61b53b7-481c-4db6-9cf3-fd824848684c" (UID: "b61b53b7-481c-4db6-9cf3-fd824848684c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.019794 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61b53b7-481c-4db6-9cf3-fd824848684c-kube-api-access-bvhgt" (OuterVolumeSpecName: "kube-api-access-bvhgt") pod "b61b53b7-481c-4db6-9cf3-fd824848684c" (UID: "b61b53b7-481c-4db6-9cf3-fd824848684c"). InnerVolumeSpecName "kube-api-access-bvhgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.048350 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b61b53b7-481c-4db6-9cf3-fd824848684c" (UID: "b61b53b7-481c-4db6-9cf3-fd824848684c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.115855 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.115886 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.115897 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvhgt\" (UniqueName: \"kubernetes.io/projected/b61b53b7-481c-4db6-9cf3-fd824848684c-kube-api-access-bvhgt\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.115906 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b61b53b7-481c-4db6-9cf3-fd824848684c-log-httpd\") on node 
\"crc\" DevicePath \"\"" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.115930 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b61b53b7-481c-4db6-9cf3-fd824848684c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.123137 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b61b53b7-481c-4db6-9cf3-fd824848684c" (UID: "b61b53b7-481c-4db6-9cf3-fd824848684c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.140671 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-config-data" (OuterVolumeSpecName: "config-data") pod "b61b53b7-481c-4db6-9cf3-fd824848684c" (UID: "b61b53b7-481c-4db6-9cf3-fd824848684c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.174477 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1b598099-b3f7-4157-8e5f-6eb472806511","Type":"ContainerStarted","Data":"cab2c2597fe3eedc75127b4143e5f1b6bbdc90ba2c1b1f74f9e373f1b0ed0f17"} Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.174519 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1b598099-b3f7-4157-8e5f-6eb472806511","Type":"ContainerStarted","Data":"0bae8e0bcabe94ac8952147ff854cb9eafce4908323d9aad1306b5064a2e57e8"} Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.181703 4749 generic.go:334] "Generic (PLEG): container finished" podID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerID="988657a47e0c4a66a76e1436fcd2954746b2fe0818b289c493f220bd639481c4" exitCode=0 Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.181740 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.181760 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b61b53b7-481c-4db6-9cf3-fd824848684c","Type":"ContainerDied","Data":"988657a47e0c4a66a76e1436fcd2954746b2fe0818b289c493f220bd639481c4"} Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.181800 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b61b53b7-481c-4db6-9cf3-fd824848684c","Type":"ContainerDied","Data":"056c5cd620b27a77df9e29f90879410427b49a093834440fd3f481dc266f1560"} Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.181819 4749 scope.go:117] "RemoveContainer" containerID="c8b9bd52a663644011966a5099dabeecfeee65a2c7dde2150c0d24d913aebf60" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.217754 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.217793 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b61b53b7-481c-4db6-9cf3-fd824848684c-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.226200 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.230828 4749 scope.go:117] "RemoveContainer" containerID="f76e09d1b1b17d297ed8a212aa28a1ec8f250f800f4a65d7f50a5698152a8cc0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.246631 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.273156 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:10 crc 
kubenswrapper[4749]: E0310 16:10:10.273760 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="proxy-httpd" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.273784 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="proxy-httpd" Mar 10 16:10:10 crc kubenswrapper[4749]: E0310 16:10:10.273807 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="ceilometer-notification-agent" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.273815 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="ceilometer-notification-agent" Mar 10 16:10:10 crc kubenswrapper[4749]: E0310 16:10:10.273849 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="sg-core" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.273856 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="sg-core" Mar 10 16:10:10 crc kubenswrapper[4749]: E0310 16:10:10.273882 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="ceilometer-central-agent" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.273888 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="ceilometer-central-agent" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.274178 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="sg-core" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.274211 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="ceilometer-central-agent" 
Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.274228 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="ceilometer-notification-agent" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.274245 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" containerName="proxy-httpd" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.277471 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.281987 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.282554 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.285745 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.290801 4749 scope.go:117] "RemoveContainer" containerID="08f66bfe75b4ccbef6dad8e9ad3817528d1ea2726f6b129318a6321118f5d6fd" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.298585 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.327197 4749 scope.go:117] "RemoveContainer" containerID="988657a47e0c4a66a76e1436fcd2954746b2fe0818b289c493f220bd639481c4" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.366221 4749 scope.go:117] "RemoveContainer" containerID="c8b9bd52a663644011966a5099dabeecfeee65a2c7dde2150c0d24d913aebf60" Mar 10 16:10:10 crc kubenswrapper[4749]: E0310 16:10:10.366866 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c8b9bd52a663644011966a5099dabeecfeee65a2c7dde2150c0d24d913aebf60\": container with ID starting with c8b9bd52a663644011966a5099dabeecfeee65a2c7dde2150c0d24d913aebf60 not found: ID does not exist" containerID="c8b9bd52a663644011966a5099dabeecfeee65a2c7dde2150c0d24d913aebf60" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.366898 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b9bd52a663644011966a5099dabeecfeee65a2c7dde2150c0d24d913aebf60"} err="failed to get container status \"c8b9bd52a663644011966a5099dabeecfeee65a2c7dde2150c0d24d913aebf60\": rpc error: code = NotFound desc = could not find container \"c8b9bd52a663644011966a5099dabeecfeee65a2c7dde2150c0d24d913aebf60\": container with ID starting with c8b9bd52a663644011966a5099dabeecfeee65a2c7dde2150c0d24d913aebf60 not found: ID does not exist" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.366922 4749 scope.go:117] "RemoveContainer" containerID="f76e09d1b1b17d297ed8a212aa28a1ec8f250f800f4a65d7f50a5698152a8cc0" Mar 10 16:10:10 crc kubenswrapper[4749]: E0310 16:10:10.367344 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f76e09d1b1b17d297ed8a212aa28a1ec8f250f800f4a65d7f50a5698152a8cc0\": container with ID starting with f76e09d1b1b17d297ed8a212aa28a1ec8f250f800f4a65d7f50a5698152a8cc0 not found: ID does not exist" containerID="f76e09d1b1b17d297ed8a212aa28a1ec8f250f800f4a65d7f50a5698152a8cc0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.367448 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f76e09d1b1b17d297ed8a212aa28a1ec8f250f800f4a65d7f50a5698152a8cc0"} err="failed to get container status \"f76e09d1b1b17d297ed8a212aa28a1ec8f250f800f4a65d7f50a5698152a8cc0\": rpc error: code = NotFound desc = could not find container \"f76e09d1b1b17d297ed8a212aa28a1ec8f250f800f4a65d7f50a5698152a8cc0\": container with ID 
starting with f76e09d1b1b17d297ed8a212aa28a1ec8f250f800f4a65d7f50a5698152a8cc0 not found: ID does not exist" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.367470 4749 scope.go:117] "RemoveContainer" containerID="08f66bfe75b4ccbef6dad8e9ad3817528d1ea2726f6b129318a6321118f5d6fd" Mar 10 16:10:10 crc kubenswrapper[4749]: E0310 16:10:10.367881 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08f66bfe75b4ccbef6dad8e9ad3817528d1ea2726f6b129318a6321118f5d6fd\": container with ID starting with 08f66bfe75b4ccbef6dad8e9ad3817528d1ea2726f6b129318a6321118f5d6fd not found: ID does not exist" containerID="08f66bfe75b4ccbef6dad8e9ad3817528d1ea2726f6b129318a6321118f5d6fd" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.367907 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f66bfe75b4ccbef6dad8e9ad3817528d1ea2726f6b129318a6321118f5d6fd"} err="failed to get container status \"08f66bfe75b4ccbef6dad8e9ad3817528d1ea2726f6b129318a6321118f5d6fd\": rpc error: code = NotFound desc = could not find container \"08f66bfe75b4ccbef6dad8e9ad3817528d1ea2726f6b129318a6321118f5d6fd\": container with ID starting with 08f66bfe75b4ccbef6dad8e9ad3817528d1ea2726f6b129318a6321118f5d6fd not found: ID does not exist" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.367924 4749 scope.go:117] "RemoveContainer" containerID="988657a47e0c4a66a76e1436fcd2954746b2fe0818b289c493f220bd639481c4" Mar 10 16:10:10 crc kubenswrapper[4749]: E0310 16:10:10.368282 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"988657a47e0c4a66a76e1436fcd2954746b2fe0818b289c493f220bd639481c4\": container with ID starting with 988657a47e0c4a66a76e1436fcd2954746b2fe0818b289c493f220bd639481c4 not found: ID does not exist" containerID="988657a47e0c4a66a76e1436fcd2954746b2fe0818b289c493f220bd639481c4" Mar 10 
16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.368311 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"988657a47e0c4a66a76e1436fcd2954746b2fe0818b289c493f220bd639481c4"} err="failed to get container status \"988657a47e0c4a66a76e1436fcd2954746b2fe0818b289c493f220bd639481c4\": rpc error: code = NotFound desc = could not find container \"988657a47e0c4a66a76e1436fcd2954746b2fe0818b289c493f220bd639481c4\": container with ID starting with 988657a47e0c4a66a76e1436fcd2954746b2fe0818b289c493f220bd639481c4 not found: ID does not exist" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.374513 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c6g4w"] Mar 10 16:10:10 crc kubenswrapper[4749]: W0310 16:10:10.379525 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9945ae2b_1140_4eb6_8212_c56f874dc891.slice/crio-fecc09feefe2007c543532eed95154c998fb29a299d0a194c55cf797a397d514 WatchSource:0}: Error finding container fecc09feefe2007c543532eed95154c998fb29a299d0a194c55cf797a397d514: Status 404 returned error can't find the container with id fecc09feefe2007c543532eed95154c998fb29a299d0a194c55cf797a397d514 Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.423076 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.423119 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-scripts\") pod \"ceilometer-0\" (UID: 
\"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.423153 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.423219 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165f576-f6e2-4cb6-a577-9d764e6880b1-run-httpd\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.423249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twnh2\" (UniqueName: \"kubernetes.io/projected/6165f576-f6e2-4cb6-a577-9d764e6880b1-kube-api-access-twnh2\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.423269 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-config-data\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.423291 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165f576-f6e2-4cb6-a577-9d764e6880b1-log-httpd\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 
16:10:10.423316 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.525613 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.525681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-scripts\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.525730 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.525833 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165f576-f6e2-4cb6-a577-9d764e6880b1-run-httpd\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.525876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twnh2\" (UniqueName: 
\"kubernetes.io/projected/6165f576-f6e2-4cb6-a577-9d764e6880b1-kube-api-access-twnh2\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.525906 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-config-data\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.525937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165f576-f6e2-4cb6-a577-9d764e6880b1-log-httpd\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.525974 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.526982 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165f576-f6e2-4cb6-a577-9d764e6880b1-log-httpd\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.531475 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-config-data\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.531814 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.534177 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165f576-f6e2-4cb6-a577-9d764e6880b1-run-httpd\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.537964 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.539304 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-scripts\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.543538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twnh2\" (UniqueName: \"kubernetes.io/projected/6165f576-f6e2-4cb6-a577-9d764e6880b1-kube-api-access-twnh2\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") " pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.549816 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") 
" pod="openstack/ceilometer-0" Mar 10 16:10:10 crc kubenswrapper[4749]: I0310 16:10:10.627644 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.134650 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:11 crc kubenswrapper[4749]: W0310 16:10:11.145837 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6165f576_f6e2_4cb6_a577_9d764e6880b1.slice/crio-255eb440ed4d1cd05e7dd2dbdd4e31d7724e2991a2bcff8a5beec226e18efb8c WatchSource:0}: Error finding container 255eb440ed4d1cd05e7dd2dbdd4e31d7724e2991a2bcff8a5beec226e18efb8c: Status 404 returned error can't find the container with id 255eb440ed4d1cd05e7dd2dbdd4e31d7724e2991a2bcff8a5beec226e18efb8c Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.193774 4749 generic.go:334] "Generic (PLEG): container finished" podID="0893ff76-efa3-496c-b499-0f6e3a4ffd59" containerID="a84826754d7077e0e9d7942da7d6c3211342df1be5029fe9d914ac549a6f2958" exitCode=0 Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.193829 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d596895b8-zh48t" event={"ID":"0893ff76-efa3-496c-b499-0f6e3a4ffd59","Type":"ContainerDied","Data":"a84826754d7077e0e9d7942da7d6c3211342df1be5029fe9d914ac549a6f2958"} Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.195977 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1b598099-b3f7-4157-8e5f-6eb472806511","Type":"ContainerStarted","Data":"5d60d1aa5d24cbc57ff5075376de396d004cbb8a0f8a549e929e2a81a8d75bd4"} Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.200416 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6165f576-f6e2-4cb6-a577-9d764e6880b1","Type":"ContainerStarted","Data":"255eb440ed4d1cd05e7dd2dbdd4e31d7724e2991a2bcff8a5beec226e18efb8c"} Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.207528 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c6g4w" event={"ID":"9945ae2b-1140-4eb6-8212-c56f874dc891","Type":"ContainerStarted","Data":"fecc09feefe2007c543532eed95154c998fb29a299d0a194c55cf797a397d514"} Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.224857 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.224836321 podStartE2EDuration="3.224836321s" podCreationTimestamp="2026-03-10 16:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:10:11.215192076 +0000 UTC m=+1308.337057753" watchObservedRunningTime="2026-03-10 16:10:11.224836321 +0000 UTC m=+1308.346702008" Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.416063 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.554431 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-ovndb-tls-certs\") pod \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.554723 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-combined-ca-bundle\") pod \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.554788 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7zbm\" (UniqueName: \"kubernetes.io/projected/0893ff76-efa3-496c-b499-0f6e3a4ffd59-kube-api-access-k7zbm\") pod \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.554827 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-config\") pod \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.554865 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-httpd-config\") pod \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\" (UID: \"0893ff76-efa3-496c-b499-0f6e3a4ffd59\") " Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.566430 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0893ff76-efa3-496c-b499-0f6e3a4ffd59" (UID: "0893ff76-efa3-496c-b499-0f6e3a4ffd59"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.566486 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0893ff76-efa3-496c-b499-0f6e3a4ffd59-kube-api-access-k7zbm" (OuterVolumeSpecName: "kube-api-access-k7zbm") pod "0893ff76-efa3-496c-b499-0f6e3a4ffd59" (UID: "0893ff76-efa3-496c-b499-0f6e3a4ffd59"). InnerVolumeSpecName "kube-api-access-k7zbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.619584 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b61b53b7-481c-4db6-9cf3-fd824848684c" path="/var/lib/kubelet/pods/b61b53b7-481c-4db6-9cf3-fd824848684c/volumes" Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.626484 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-config" (OuterVolumeSpecName: "config") pod "0893ff76-efa3-496c-b499-0f6e3a4ffd59" (UID: "0893ff76-efa3-496c-b499-0f6e3a4ffd59"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.655986 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0893ff76-efa3-496c-b499-0f6e3a4ffd59" (UID: "0893ff76-efa3-496c-b499-0f6e3a4ffd59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.657540 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.657563 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7zbm\" (UniqueName: \"kubernetes.io/projected/0893ff76-efa3-496c-b499-0f6e3a4ffd59-kube-api-access-k7zbm\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.657574 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.657585 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.667758 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0893ff76-efa3-496c-b499-0f6e3a4ffd59" (UID: "0893ff76-efa3-496c-b499-0f6e3a4ffd59"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:11 crc kubenswrapper[4749]: I0310 16:10:11.759659 4749 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0893ff76-efa3-496c-b499-0f6e3a4ffd59-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:12 crc kubenswrapper[4749]: I0310 16:10:12.221725 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d596895b8-zh48t" event={"ID":"0893ff76-efa3-496c-b499-0f6e3a4ffd59","Type":"ContainerDied","Data":"c0a96ede134ecf3efecb01f0830126d4278b5249231e5ab86fe4c06e26a90648"} Mar 10 16:10:12 crc kubenswrapper[4749]: I0310 16:10:12.221804 4749 scope.go:117] "RemoveContainer" containerID="47d39ee2038a7c20b81e06ce8eb1f8cc68d20deef858b1d0a015b6fa325c1429" Mar 10 16:10:12 crc kubenswrapper[4749]: I0310 16:10:12.221746 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d596895b8-zh48t" Mar 10 16:10:12 crc kubenswrapper[4749]: I0310 16:10:12.259030 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7d596895b8-zh48t"] Mar 10 16:10:12 crc kubenswrapper[4749]: I0310 16:10:12.268718 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7d596895b8-zh48t"] Mar 10 16:10:12 crc kubenswrapper[4749]: I0310 16:10:12.273790 4749 scope.go:117] "RemoveContainer" containerID="a84826754d7077e0e9d7942da7d6c3211342df1be5029fe9d914ac549a6f2958" Mar 10 16:10:12 crc kubenswrapper[4749]: I0310 16:10:12.588603 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 16:10:13 crc kubenswrapper[4749]: I0310 16:10:13.327493 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:10:13 crc kubenswrapper[4749]: I0310 16:10:13.453067 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:10:13 crc kubenswrapper[4749]: I0310 16:10:13.514811 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6d5d974b56-qhhk8"] Mar 10 16:10:13 crc kubenswrapper[4749]: I0310 16:10:13.515099 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6d5d974b56-qhhk8" podUID="cc930999-5118-4423-a996-c11f390919f2" containerName="placement-log" containerID="cri-o://6821906d92b9f98d07a757898096aceb546ffc4e4e1a632f5c9911d18b877b56" gracePeriod=30 Mar 10 16:10:13 crc kubenswrapper[4749]: I0310 16:10:13.515566 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6d5d974b56-qhhk8" podUID="cc930999-5118-4423-a996-c11f390919f2" containerName="placement-api" containerID="cri-o://8a3c6f2258745be6f39810ceb82863bebc0e919f31b8df07a4f0c7716b28f461" gracePeriod=30 Mar 10 16:10:13 crc kubenswrapper[4749]: I0310 16:10:13.626217 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0893ff76-efa3-496c-b499-0f6e3a4ffd59" path="/var/lib/kubelet/pods/0893ff76-efa3-496c-b499-0f6e3a4ffd59/volumes" Mar 10 16:10:13 crc kubenswrapper[4749]: I0310 16:10:13.936637 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:14 crc kubenswrapper[4749]: I0310 16:10:14.249723 4749 generic.go:334] "Generic (PLEG): container finished" podID="cc930999-5118-4423-a996-c11f390919f2" containerID="6821906d92b9f98d07a757898096aceb546ffc4e4e1a632f5c9911d18b877b56" exitCode=143 Mar 10 16:10:14 crc kubenswrapper[4749]: I0310 16:10:14.249805 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d5d974b56-qhhk8" event={"ID":"cc930999-5118-4423-a996-c11f390919f2","Type":"ContainerDied","Data":"6821906d92b9f98d07a757898096aceb546ffc4e4e1a632f5c9911d18b877b56"} Mar 10 16:10:14 crc kubenswrapper[4749]: I0310 16:10:14.687205 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 16:10:14 crc kubenswrapper[4749]: I0310 16:10:14.687251 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 16:10:14 crc kubenswrapper[4749]: I0310 16:10:14.735742 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 16:10:14 crc kubenswrapper[4749]: I0310 16:10:14.744143 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 16:10:15 crc kubenswrapper[4749]: I0310 16:10:15.262965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165f576-f6e2-4cb6-a577-9d764e6880b1","Type":"ContainerStarted","Data":"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531"} Mar 10 16:10:15 crc kubenswrapper[4749]: I0310 16:10:15.263361 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 16:10:15 crc kubenswrapper[4749]: I0310 16:10:15.263415 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 16:10:16 crc kubenswrapper[4749]: I0310 16:10:16.788559 4749 scope.go:117] "RemoveContainer" containerID="a38bdddb79d11597cebedbefa552e2cec753c79dee4ea9ffef4c2aee1e2733b9" Mar 10 16:10:17 crc kubenswrapper[4749]: I0310 16:10:17.302239 4749 generic.go:334] "Generic (PLEG): container finished" podID="cc930999-5118-4423-a996-c11f390919f2" containerID="8a3c6f2258745be6f39810ceb82863bebc0e919f31b8df07a4f0c7716b28f461" exitCode=0 Mar 10 16:10:17 crc kubenswrapper[4749]: I0310 16:10:17.302957 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d5d974b56-qhhk8" 
event={"ID":"cc930999-5118-4423-a996-c11f390919f2","Type":"ContainerDied","Data":"8a3c6f2258745be6f39810ceb82863bebc0e919f31b8df07a4f0c7716b28f461"} Mar 10 16:10:17 crc kubenswrapper[4749]: I0310 16:10:17.304485 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 16:10:17 crc kubenswrapper[4749]: I0310 16:10:17.304517 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 16:10:17 crc kubenswrapper[4749]: I0310 16:10:17.433243 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 16:10:17 crc kubenswrapper[4749]: I0310 16:10:17.445151 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 16:10:18 crc kubenswrapper[4749]: I0310 16:10:18.546518 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 16:10:18 crc kubenswrapper[4749]: I0310 16:10:18.547257 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 16:10:18 crc kubenswrapper[4749]: I0310 16:10:18.589132 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 16:10:18 crc kubenswrapper[4749]: I0310 16:10:18.600726 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 16:10:19 crc kubenswrapper[4749]: I0310 16:10:19.321736 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 16:10:19 crc kubenswrapper[4749]: I0310 16:10:19.322078 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.331956 4749 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.342700 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d5d974b56-qhhk8" Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.343165 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d5d974b56-qhhk8" event={"ID":"cc930999-5118-4423-a996-c11f390919f2","Type":"ContainerDied","Data":"3e009ff08a655289306604f52acf85efa7bdb3fd98be73ec44893e0e1214a45f"} Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.343198 4749 scope.go:117] "RemoveContainer" containerID="8a3c6f2258745be6f39810ceb82863bebc0e919f31b8df07a4f0c7716b28f461" Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.422046 4749 scope.go:117] "RemoveContainer" containerID="6821906d92b9f98d07a757898096aceb546ffc4e4e1a632f5c9911d18b877b56" Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.454144 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-public-tls-certs\") pod \"cc930999-5118-4423-a996-c11f390919f2\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.454199 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-scripts\") pod \"cc930999-5118-4423-a996-c11f390919f2\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.454414 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-internal-tls-certs\") pod \"cc930999-5118-4423-a996-c11f390919f2\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") " Mar 10 
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.454466 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc930999-5118-4423-a996-c11f390919f2-logs\") pod \"cc930999-5118-4423-a996-c11f390919f2\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") "
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.454541 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrns9\" (UniqueName: \"kubernetes.io/projected/cc930999-5118-4423-a996-c11f390919f2-kube-api-access-mrns9\") pod \"cc930999-5118-4423-a996-c11f390919f2\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") "
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.454692 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-config-data\") pod \"cc930999-5118-4423-a996-c11f390919f2\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") "
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.454844 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-combined-ca-bundle\") pod \"cc930999-5118-4423-a996-c11f390919f2\" (UID: \"cc930999-5118-4423-a996-c11f390919f2\") "
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.455200 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc930999-5118-4423-a996-c11f390919f2-logs" (OuterVolumeSpecName: "logs") pod "cc930999-5118-4423-a996-c11f390919f2" (UID: "cc930999-5118-4423-a996-c11f390919f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.455708 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc930999-5118-4423-a996-c11f390919f2-logs\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.461472 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc930999-5118-4423-a996-c11f390919f2-kube-api-access-mrns9" (OuterVolumeSpecName: "kube-api-access-mrns9") pod "cc930999-5118-4423-a996-c11f390919f2" (UID: "cc930999-5118-4423-a996-c11f390919f2"). InnerVolumeSpecName "kube-api-access-mrns9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.475121 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-scripts" (OuterVolumeSpecName: "scripts") pod "cc930999-5118-4423-a996-c11f390919f2" (UID: "cc930999-5118-4423-a996-c11f390919f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.533570 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc930999-5118-4423-a996-c11f390919f2" (UID: "cc930999-5118-4423-a996-c11f390919f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.542691 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-config-data" (OuterVolumeSpecName: "config-data") pod "cc930999-5118-4423-a996-c11f390919f2" (UID: "cc930999-5118-4423-a996-c11f390919f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.562323 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrns9\" (UniqueName: \"kubernetes.io/projected/cc930999-5118-4423-a996-c11f390919f2-kube-api-access-mrns9\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.562350 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.562360 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.562369 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.573736 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cc930999-5118-4423-a996-c11f390919f2" (UID: "cc930999-5118-4423-a996-c11f390919f2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.591029 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cc930999-5118-4423-a996-c11f390919f2" (UID: "cc930999-5118-4423-a996-c11f390919f2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.663745 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.663778 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc930999-5118-4423-a996-c11f390919f2-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.760448 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6d5d974b56-qhhk8"]
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.768586 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6d5d974b56-qhhk8"]
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.980567 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 16:10:20 crc kubenswrapper[4749]: I0310 16:10:20.980621 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 16:10:21 crc kubenswrapper[4749]: I0310 16:10:21.355129 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165f576-f6e2-4cb6-a577-9d764e6880b1","Type":"ContainerStarted","Data":"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60"}
Mar 10 16:10:21 crc kubenswrapper[4749]: I0310 16:10:21.356804 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c6g4w" event={"ID":"9945ae2b-1140-4eb6-8212-c56f874dc891","Type":"ContainerStarted","Data":"3c8d84eda7a09a27be2cbbd7dd5b4c073f4fa684f122a6c3187ab2737bdf593d"}
Mar 10 16:10:21 crc kubenswrapper[4749]: I0310 16:10:21.381640 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-c6g4w" podStartSLOduration=2.2690204290000002 podStartE2EDuration="12.38162122s" podCreationTimestamp="2026-03-10 16:10:09 +0000 UTC" firstStartedPulling="2026-03-10 16:10:10.381164267 +0000 UTC m=+1307.503029954" lastFinishedPulling="2026-03-10 16:10:20.493765058 +0000 UTC m=+1317.615630745" observedRunningTime="2026-03-10 16:10:21.373580829 +0000 UTC m=+1318.495446526" watchObservedRunningTime="2026-03-10 16:10:21.38162122 +0000 UTC m=+1318.503486907"
Mar 10 16:10:21 crc kubenswrapper[4749]: I0310 16:10:21.498198 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 10 16:10:21 crc kubenswrapper[4749]: I0310 16:10:21.498311 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 16:10:21 crc kubenswrapper[4749]: I0310 16:10:21.512088 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 10 16:10:21 crc kubenswrapper[4749]: I0310 16:10:21.620724 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc930999-5118-4423-a996-c11f390919f2" path="/var/lib/kubelet/pods/cc930999-5118-4423-a996-c11f390919f2/volumes"
Mar 10 16:10:23 crc kubenswrapper[4749]: I0310 16:10:23.379422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165f576-f6e2-4cb6-a577-9d764e6880b1","Type":"ContainerStarted","Data":"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b"}
Mar 10 16:10:26 crc kubenswrapper[4749]: I0310 16:10:26.418244 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165f576-f6e2-4cb6-a577-9d764e6880b1","Type":"ContainerStarted","Data":"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9"}
Mar 10 16:10:26 crc kubenswrapper[4749]: I0310 16:10:26.418444 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="ceilometer-central-agent" containerID="cri-o://e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531" gracePeriod=30
Mar 10 16:10:26 crc kubenswrapper[4749]: I0310 16:10:26.418508 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="proxy-httpd" containerID="cri-o://b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9" gracePeriod=30
Mar 10 16:10:26 crc kubenswrapper[4749]: I0310 16:10:26.419104 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 10 16:10:26 crc kubenswrapper[4749]: I0310 16:10:26.418587 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="ceilometer-notification-agent" containerID="cri-o://7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60" gracePeriod=30
Mar 10 16:10:26 crc kubenswrapper[4749]: I0310 16:10:26.418627 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="sg-core" containerID="cri-o://a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b" gracePeriod=30
Mar 10 16:10:26 crc kubenswrapper[4749]: I0310 16:10:26.467556 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.157371364 podStartE2EDuration="16.467536692s" podCreationTimestamp="2026-03-10 16:10:10 +0000 UTC" firstStartedPulling="2026-03-10 16:10:11.151040147 +0000 UTC m=+1308.272905834" lastFinishedPulling="2026-03-10 16:10:25.461205455 +0000 UTC m=+1322.583071162" observedRunningTime="2026-03-10 16:10:26.451062927 +0000 UTC m=+1323.572928624" watchObservedRunningTime="2026-03-10 16:10:26.467536692 +0000 UTC m=+1323.589402389"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.195991 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.296063 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165f576-f6e2-4cb6-a577-9d764e6880b1-log-httpd\") pod \"6165f576-f6e2-4cb6-a577-9d764e6880b1\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") "
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.296176 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-ceilometer-tls-certs\") pod \"6165f576-f6e2-4cb6-a577-9d764e6880b1\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") "
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.296232 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-combined-ca-bundle\") pod \"6165f576-f6e2-4cb6-a577-9d764e6880b1\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") "
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.296315 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twnh2\" (UniqueName: \"kubernetes.io/projected/6165f576-f6e2-4cb6-a577-9d764e6880b1-kube-api-access-twnh2\") pod \"6165f576-f6e2-4cb6-a577-9d764e6880b1\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") "
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.296349 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-config-data\") pod \"6165f576-f6e2-4cb6-a577-9d764e6880b1\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") "
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.296384 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165f576-f6e2-4cb6-a577-9d764e6880b1-run-httpd\") pod \"6165f576-f6e2-4cb6-a577-9d764e6880b1\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") "
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.297778 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-scripts\") pod \"6165f576-f6e2-4cb6-a577-9d764e6880b1\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") "
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.296893 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6165f576-f6e2-4cb6-a577-9d764e6880b1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6165f576-f6e2-4cb6-a577-9d764e6880b1" (UID: "6165f576-f6e2-4cb6-a577-9d764e6880b1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.296938 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6165f576-f6e2-4cb6-a577-9d764e6880b1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6165f576-f6e2-4cb6-a577-9d764e6880b1" (UID: "6165f576-f6e2-4cb6-a577-9d764e6880b1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.297973 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-sg-core-conf-yaml\") pod \"6165f576-f6e2-4cb6-a577-9d764e6880b1\" (UID: \"6165f576-f6e2-4cb6-a577-9d764e6880b1\") "
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.298733 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165f576-f6e2-4cb6-a577-9d764e6880b1-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.298764 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165f576-f6e2-4cb6-a577-9d764e6880b1-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.306578 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-scripts" (OuterVolumeSpecName: "scripts") pod "6165f576-f6e2-4cb6-a577-9d764e6880b1" (UID: "6165f576-f6e2-4cb6-a577-9d764e6880b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.306804 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6165f576-f6e2-4cb6-a577-9d764e6880b1-kube-api-access-twnh2" (OuterVolumeSpecName: "kube-api-access-twnh2") pod "6165f576-f6e2-4cb6-a577-9d764e6880b1" (UID: "6165f576-f6e2-4cb6-a577-9d764e6880b1"). InnerVolumeSpecName "kube-api-access-twnh2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.353116 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6165f576-f6e2-4cb6-a577-9d764e6880b1" (UID: "6165f576-f6e2-4cb6-a577-9d764e6880b1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.366895 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6165f576-f6e2-4cb6-a577-9d764e6880b1" (UID: "6165f576-f6e2-4cb6-a577-9d764e6880b1"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.396978 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6165f576-f6e2-4cb6-a577-9d764e6880b1" (UID: "6165f576-f6e2-4cb6-a577-9d764e6880b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.401138 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.401184 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.401200 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twnh2\" (UniqueName: \"kubernetes.io/projected/6165f576-f6e2-4cb6-a577-9d764e6880b1-kube-api-access-twnh2\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.401239 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.401253 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.414278 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-config-data" (OuterVolumeSpecName: "config-data") pod "6165f576-f6e2-4cb6-a577-9d764e6880b1" (UID: "6165f576-f6e2-4cb6-a577-9d764e6880b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.431149 4749 generic.go:334] "Generic (PLEG): container finished" podID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerID="b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9" exitCode=0
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.432080 4749 generic.go:334] "Generic (PLEG): container finished" podID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerID="a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b" exitCode=2
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.432173 4749 generic.go:334] "Generic (PLEG): container finished" podID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerID="7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60" exitCode=0
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.432315 4749 generic.go:334] "Generic (PLEG): container finished" podID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerID="e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531" exitCode=0
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.432455 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165f576-f6e2-4cb6-a577-9d764e6880b1","Type":"ContainerDied","Data":"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9"}
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.432592 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165f576-f6e2-4cb6-a577-9d764e6880b1","Type":"ContainerDied","Data":"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b"}
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.432720 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165f576-f6e2-4cb6-a577-9d764e6880b1","Type":"ContainerDied","Data":"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60"}
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.432821 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165f576-f6e2-4cb6-a577-9d764e6880b1","Type":"ContainerDied","Data":"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531"}
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.432914 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165f576-f6e2-4cb6-a577-9d764e6880b1","Type":"ContainerDied","Data":"255eb440ed4d1cd05e7dd2dbdd4e31d7724e2991a2bcff8a5beec226e18efb8c"}
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.433011 4749 scope.go:117] "RemoveContainer" containerID="b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.433267 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.485498 4749 scope.go:117] "RemoveContainer" containerID="a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.505731 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6165f576-f6e2-4cb6-a577-9d764e6880b1-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.505816 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.521604 4749 scope.go:117] "RemoveContainer" containerID="7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.524608 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.559494 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 16:10:27 crc kubenswrapper[4749]: E0310 16:10:27.560062 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc930999-5118-4423-a996-c11f390919f2" containerName="placement-log"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560085 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc930999-5118-4423-a996-c11f390919f2" containerName="placement-log"
Mar 10 16:10:27 crc kubenswrapper[4749]: E0310 16:10:27.560102 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="sg-core"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560110 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="sg-core"
Mar 10 16:10:27 crc kubenswrapper[4749]: E0310 16:10:27.560126 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc930999-5118-4423-a996-c11f390919f2" containerName="placement-api"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560160 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc930999-5118-4423-a996-c11f390919f2" containerName="placement-api"
Mar 10 16:10:27 crc kubenswrapper[4749]: E0310 16:10:27.560178 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="proxy-httpd"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560187 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="proxy-httpd"
Mar 10 16:10:27 crc kubenswrapper[4749]: E0310 16:10:27.560202 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="ceilometer-notification-agent"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560212 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="ceilometer-notification-agent"
Mar 10 16:10:27 crc kubenswrapper[4749]: E0310 16:10:27.560230 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0893ff76-efa3-496c-b499-0f6e3a4ffd59" containerName="neutron-api"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560239 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0893ff76-efa3-496c-b499-0f6e3a4ffd59" containerName="neutron-api"
Mar 10 16:10:27 crc kubenswrapper[4749]: E0310 16:10:27.560253 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0893ff76-efa3-496c-b499-0f6e3a4ffd59" containerName="neutron-httpd"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560263 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0893ff76-efa3-496c-b499-0f6e3a4ffd59" containerName="neutron-httpd"
Mar 10 16:10:27 crc kubenswrapper[4749]: E0310 16:10:27.560291 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="ceilometer-central-agent"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560300 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="ceilometer-central-agent"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560564 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="sg-core"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560586 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="ceilometer-notification-agent"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560609 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc930999-5118-4423-a996-c11f390919f2" containerName="placement-api"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560627 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0893ff76-efa3-496c-b499-0f6e3a4ffd59" containerName="neutron-api"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560646 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="ceilometer-central-agent"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560659 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0893ff76-efa3-496c-b499-0f6e3a4ffd59" containerName="neutron-httpd"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560674 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc930999-5118-4423-a996-c11f390919f2" containerName="placement-log"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.560691 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" containerName="proxy-httpd"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.562811 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.565630 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.566196 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.566079 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.574700 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.607686 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.607824 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-scripts\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.607936 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e580e649-77de-4c7c-9192-ba338b5fcd59-run-httpd\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.607965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e580e649-77de-4c7c-9192-ba338b5fcd59-log-httpd\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.608045 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvkdb\" (UniqueName: \"kubernetes.io/projected/e580e649-77de-4c7c-9192-ba338b5fcd59-kube-api-access-pvkdb\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.608577 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.608796 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.608833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-config-data\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.619202 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6165f576-f6e2-4cb6-a577-9d764e6880b1" path="/var/lib/kubelet/pods/6165f576-f6e2-4cb6-a577-9d764e6880b1/volumes"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.624927 4749 scope.go:117] "RemoveContainer" containerID="e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.652230 4749 scope.go:117] "RemoveContainer" containerID="b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9"
Mar 10 16:10:27 crc kubenswrapper[4749]: E0310 16:10:27.653688 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9\": container with ID starting with b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9 not found: ID does not exist" containerID="b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.653733 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9"} err="failed to get container status \"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9\": rpc error: code = NotFound desc = could not find container \"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9\": container with ID starting with b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9 not found: ID does not exist"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.653764 4749 scope.go:117] "RemoveContainer" containerID="a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b"
Mar 10 16:10:27 crc kubenswrapper[4749]: E0310 16:10:27.654351 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b\": container with ID starting with a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b not found: ID does not exist" containerID="a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.654418 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b"} err="failed to get container status \"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b\": rpc error: code = NotFound desc = could not find container \"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b\": container with ID starting with a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b not found: ID does not exist"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.654455 4749 scope.go:117] "RemoveContainer" containerID="7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60"
Mar 10 16:10:27 crc kubenswrapper[4749]: E0310 16:10:27.654852 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60\": container with ID starting with 7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60 not found: ID does not exist" containerID="7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.654877 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60"} err="failed to get container status \"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60\": rpc error: code = NotFound desc = could not find container \"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60\": container with ID starting with 7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60 not found: ID does not exist"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.654894 4749 scope.go:117] "RemoveContainer" containerID="e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531"
Mar 10 16:10:27 crc kubenswrapper[4749]: E0310 16:10:27.656512 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531\": container with ID starting with e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531 not found: ID does not exist" containerID="e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.656556 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531"} err="failed to get container status \"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531\": rpc error: code = NotFound desc = could not find container \"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531\": container with ID starting with e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531 not found: ID does not exist"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.656578 4749 scope.go:117] "RemoveContainer" containerID="b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.657089 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9"} err="failed to get container status \"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9\": rpc error: code = NotFound desc = could not find container \"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9\": container with ID starting with b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9 not found: ID does not exist"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.657123 4749 scope.go:117] "RemoveContainer" containerID="a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.657486 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b"} err="failed to get container status \"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b\": rpc error: code = NotFound desc = could not find container \"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b\": container with ID starting with a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b not found: ID does not exist"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.657529 4749 scope.go:117] "RemoveContainer" containerID="7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60"
Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.658127 4749 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60"} err="failed to get container status \"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60\": rpc error: code = NotFound desc = could not find container \"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60\": container with ID starting with 7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60 not found: ID does not exist" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.658155 4749 scope.go:117] "RemoveContainer" containerID="e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.658474 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531"} err="failed to get container status \"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531\": rpc error: code = NotFound desc = could not find container \"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531\": container with ID starting with e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531 not found: ID does not exist" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.658498 4749 scope.go:117] "RemoveContainer" containerID="b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.658758 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9"} err="failed to get container status \"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9\": rpc error: code = NotFound desc = could not find container \"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9\": container with ID starting with b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9 not found: ID does not 
exist" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.658779 4749 scope.go:117] "RemoveContainer" containerID="a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.659072 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b"} err="failed to get container status \"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b\": rpc error: code = NotFound desc = could not find container \"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b\": container with ID starting with a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b not found: ID does not exist" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.659100 4749 scope.go:117] "RemoveContainer" containerID="7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.659459 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60"} err="failed to get container status \"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60\": rpc error: code = NotFound desc = could not find container \"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60\": container with ID starting with 7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60 not found: ID does not exist" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.659486 4749 scope.go:117] "RemoveContainer" containerID="e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.659795 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531"} err="failed to get container status 
\"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531\": rpc error: code = NotFound desc = could not find container \"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531\": container with ID starting with e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531 not found: ID does not exist" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.659823 4749 scope.go:117] "RemoveContainer" containerID="b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.660098 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9"} err="failed to get container status \"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9\": rpc error: code = NotFound desc = could not find container \"b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9\": container with ID starting with b7d0fc8e6b65b2ac6ffb970bc71b9b18c771c37579452e7501eb77c84bcd62f9 not found: ID does not exist" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.660141 4749 scope.go:117] "RemoveContainer" containerID="a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.660542 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b"} err="failed to get container status \"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b\": rpc error: code = NotFound desc = could not find container \"a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b\": container with ID starting with a69ce8a4900aa397000767b224d20b04d565cc352537d8912994d8956c7c947b not found: ID does not exist" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.660572 4749 scope.go:117] "RemoveContainer" 
containerID="7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.660917 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60"} err="failed to get container status \"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60\": rpc error: code = NotFound desc = could not find container \"7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60\": container with ID starting with 7b648b7fdfdbb4e62292c16d8d4a812c3c9c2f2bd9ce3f8cfddcc5f744fbab60 not found: ID does not exist" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.660947 4749 scope.go:117] "RemoveContainer" containerID="e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.662718 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531"} err="failed to get container status \"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531\": rpc error: code = NotFound desc = could not find container \"e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531\": container with ID starting with e334ed83aac3f08c3e94de4ca0722d5709f82636b47f1f9a460140ff79f8d531 not found: ID does not exist" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.709972 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.710029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-scripts\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.710100 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e580e649-77de-4c7c-9192-ba338b5fcd59-run-httpd\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.710127 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e580e649-77de-4c7c-9192-ba338b5fcd59-log-httpd\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.710163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvkdb\" (UniqueName: \"kubernetes.io/projected/e580e649-77de-4c7c-9192-ba338b5fcd59-kube-api-access-pvkdb\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.710194 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.710309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc 
kubenswrapper[4749]: I0310 16:10:27.710361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-config-data\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.711749 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e580e649-77de-4c7c-9192-ba338b5fcd59-log-httpd\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.711956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e580e649-77de-4c7c-9192-ba338b5fcd59-run-httpd\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.716933 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.716969 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.716998 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-scripts\") pod \"ceilometer-0\" (UID: 
\"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.717755 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.717930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-config-data\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.731789 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvkdb\" (UniqueName: \"kubernetes.io/projected/e580e649-77de-4c7c-9192-ba338b5fcd59-kube-api-access-pvkdb\") pod \"ceilometer-0\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " pod="openstack/ceilometer-0" Mar 10 16:10:27 crc kubenswrapper[4749]: I0310 16:10:27.920609 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:10:28 crc kubenswrapper[4749]: W0310 16:10:28.444641 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode580e649_77de_4c7c_9192_ba338b5fcd59.slice/crio-474b2cd1f63723920ba8ccab9ad6ba4e60684c87daaca691a2fce03b13a83291 WatchSource:0}: Error finding container 474b2cd1f63723920ba8ccab9ad6ba4e60684c87daaca691a2fce03b13a83291: Status 404 returned error can't find the container with id 474b2cd1f63723920ba8ccab9ad6ba4e60684c87daaca691a2fce03b13a83291 Mar 10 16:10:28 crc kubenswrapper[4749]: I0310 16:10:28.449714 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:29 crc kubenswrapper[4749]: I0310 16:10:29.461649 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e580e649-77de-4c7c-9192-ba338b5fcd59","Type":"ContainerStarted","Data":"474b2cd1f63723920ba8ccab9ad6ba4e60684c87daaca691a2fce03b13a83291"} Mar 10 16:10:31 crc kubenswrapper[4749]: I0310 16:10:31.491075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e580e649-77de-4c7c-9192-ba338b5fcd59","Type":"ContainerStarted","Data":"d4b792ab0bcc5705430176a45b8287b9d8a6cf96d11c84fae30937cf82b75e62"} Mar 10 16:10:31 crc kubenswrapper[4749]: I0310 16:10:31.509831 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:32 crc kubenswrapper[4749]: I0310 16:10:32.509159 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e580e649-77de-4c7c-9192-ba338b5fcd59","Type":"ContainerStarted","Data":"68ff7ce9e6808006f78dff400a44ff5eeab8c16df562c0f501a36cef61333ac6"} Mar 10 16:10:32 crc kubenswrapper[4749]: I0310 16:10:32.509937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e580e649-77de-4c7c-9192-ba338b5fcd59","Type":"ContainerStarted","Data":"ec08f287b242dbcd1940ce0dab6cc40bc4a4cf380353a18ec70ed10f9f2291b1"} Mar 10 16:10:34 crc kubenswrapper[4749]: I0310 16:10:34.532176 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e580e649-77de-4c7c-9192-ba338b5fcd59","Type":"ContainerStarted","Data":"6180f096c5ae7989c778c5c1b56ceef1f8f19122465a49ded72d00c4ac75eb33"} Mar 10 16:10:34 crc kubenswrapper[4749]: I0310 16:10:34.532862 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 16:10:34 crc kubenswrapper[4749]: I0310 16:10:34.532731 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="proxy-httpd" containerID="cri-o://6180f096c5ae7989c778c5c1b56ceef1f8f19122465a49ded72d00c4ac75eb33" gracePeriod=30 Mar 10 16:10:34 crc kubenswrapper[4749]: I0310 16:10:34.532440 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="ceilometer-central-agent" containerID="cri-o://d4b792ab0bcc5705430176a45b8287b9d8a6cf96d11c84fae30937cf82b75e62" gracePeriod=30 Mar 10 16:10:34 crc kubenswrapper[4749]: I0310 16:10:34.532756 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="ceilometer-notification-agent" containerID="cri-o://ec08f287b242dbcd1940ce0dab6cc40bc4a4cf380353a18ec70ed10f9f2291b1" gracePeriod=30 Mar 10 16:10:34 crc kubenswrapper[4749]: I0310 16:10:34.532745 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="sg-core" containerID="cri-o://68ff7ce9e6808006f78dff400a44ff5eeab8c16df562c0f501a36cef61333ac6" 
gracePeriod=30 Mar 10 16:10:34 crc kubenswrapper[4749]: I0310 16:10:34.571317 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8402782819999999 podStartE2EDuration="7.571292415s" podCreationTimestamp="2026-03-10 16:10:27 +0000 UTC" firstStartedPulling="2026-03-10 16:10:28.449049628 +0000 UTC m=+1325.570915325" lastFinishedPulling="2026-03-10 16:10:34.180063771 +0000 UTC m=+1331.301929458" observedRunningTime="2026-03-10 16:10:34.56202715 +0000 UTC m=+1331.683892837" watchObservedRunningTime="2026-03-10 16:10:34.571292415 +0000 UTC m=+1331.693158102" Mar 10 16:10:35 crc kubenswrapper[4749]: I0310 16:10:35.565858 4749 generic.go:334] "Generic (PLEG): container finished" podID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerID="6180f096c5ae7989c778c5c1b56ceef1f8f19122465a49ded72d00c4ac75eb33" exitCode=0 Mar 10 16:10:35 crc kubenswrapper[4749]: I0310 16:10:35.566151 4749 generic.go:334] "Generic (PLEG): container finished" podID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerID="68ff7ce9e6808006f78dff400a44ff5eeab8c16df562c0f501a36cef61333ac6" exitCode=2 Mar 10 16:10:35 crc kubenswrapper[4749]: I0310 16:10:35.566161 4749 generic.go:334] "Generic (PLEG): container finished" podID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerID="ec08f287b242dbcd1940ce0dab6cc40bc4a4cf380353a18ec70ed10f9f2291b1" exitCode=0 Mar 10 16:10:35 crc kubenswrapper[4749]: I0310 16:10:35.566170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e580e649-77de-4c7c-9192-ba338b5fcd59","Type":"ContainerDied","Data":"6180f096c5ae7989c778c5c1b56ceef1f8f19122465a49ded72d00c4ac75eb33"} Mar 10 16:10:35 crc kubenswrapper[4749]: I0310 16:10:35.566243 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e580e649-77de-4c7c-9192-ba338b5fcd59","Type":"ContainerDied","Data":"68ff7ce9e6808006f78dff400a44ff5eeab8c16df562c0f501a36cef61333ac6"} Mar 10 
16:10:35 crc kubenswrapper[4749]: I0310 16:10:35.566267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e580e649-77de-4c7c-9192-ba338b5fcd59","Type":"ContainerDied","Data":"ec08f287b242dbcd1940ce0dab6cc40bc4a4cf380353a18ec70ed10f9f2291b1"} Mar 10 16:10:35 crc kubenswrapper[4749]: I0310 16:10:35.569427 4749 generic.go:334] "Generic (PLEG): container finished" podID="9945ae2b-1140-4eb6-8212-c56f874dc891" containerID="3c8d84eda7a09a27be2cbbd7dd5b4c073f4fa684f122a6c3187ab2737bdf593d" exitCode=0 Mar 10 16:10:35 crc kubenswrapper[4749]: I0310 16:10:35.569461 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c6g4w" event={"ID":"9945ae2b-1140-4eb6-8212-c56f874dc891","Type":"ContainerDied","Data":"3c8d84eda7a09a27be2cbbd7dd5b4c073f4fa684f122a6c3187ab2737bdf593d"} Mar 10 16:10:36 crc kubenswrapper[4749]: I0310 16:10:36.975258 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.015277 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwkfs\" (UniqueName: \"kubernetes.io/projected/9945ae2b-1140-4eb6-8212-c56f874dc891-kube-api-access-wwkfs\") pod \"9945ae2b-1140-4eb6-8212-c56f874dc891\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.015334 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-config-data\") pod \"9945ae2b-1140-4eb6-8212-c56f874dc891\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.015404 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-scripts\") pod \"9945ae2b-1140-4eb6-8212-c56f874dc891\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.015482 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-combined-ca-bundle\") pod \"9945ae2b-1140-4eb6-8212-c56f874dc891\" (UID: \"9945ae2b-1140-4eb6-8212-c56f874dc891\") " Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.045048 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-scripts" (OuterVolumeSpecName: "scripts") pod "9945ae2b-1140-4eb6-8212-c56f874dc891" (UID: "9945ae2b-1140-4eb6-8212-c56f874dc891"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.047767 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9945ae2b-1140-4eb6-8212-c56f874dc891-kube-api-access-wwkfs" (OuterVolumeSpecName: "kube-api-access-wwkfs") pod "9945ae2b-1140-4eb6-8212-c56f874dc891" (UID: "9945ae2b-1140-4eb6-8212-c56f874dc891"). InnerVolumeSpecName "kube-api-access-wwkfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.069670 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-config-data" (OuterVolumeSpecName: "config-data") pod "9945ae2b-1140-4eb6-8212-c56f874dc891" (UID: "9945ae2b-1140-4eb6-8212-c56f874dc891"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.077860 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9945ae2b-1140-4eb6-8212-c56f874dc891" (UID: "9945ae2b-1140-4eb6-8212-c56f874dc891"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.117628 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwkfs\" (UniqueName: \"kubernetes.io/projected/9945ae2b-1140-4eb6-8212-c56f874dc891-kube-api-access-wwkfs\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.117673 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.117687 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.117699 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9945ae2b-1140-4eb6-8212-c56f874dc891-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.592783 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c6g4w" event={"ID":"9945ae2b-1140-4eb6-8212-c56f874dc891","Type":"ContainerDied","Data":"fecc09feefe2007c543532eed95154c998fb29a299d0a194c55cf797a397d514"} Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.593060 4749 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="fecc09feefe2007c543532eed95154c998fb29a299d0a194c55cf797a397d514" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.592950 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c6g4w" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.704896 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 16:10:37 crc kubenswrapper[4749]: E0310 16:10:37.705271 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9945ae2b-1140-4eb6-8212-c56f874dc891" containerName="nova-cell0-conductor-db-sync" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.705282 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9945ae2b-1140-4eb6-8212-c56f874dc891" containerName="nova-cell0-conductor-db-sync" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.705550 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9945ae2b-1140-4eb6-8212-c56f874dc891" containerName="nova-cell0-conductor-db-sync" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.706116 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.709013 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.709190 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qlxkb" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.729798 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c31b4d97-4ea8-411f-873a-1ad6c133b917\") " pod="openstack/nova-cell0-conductor-0" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.729945 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c31b4d97-4ea8-411f-873a-1ad6c133b917\") " pod="openstack/nova-cell0-conductor-0" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.729996 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25n8c\" (UniqueName: \"kubernetes.io/projected/c31b4d97-4ea8-411f-873a-1ad6c133b917-kube-api-access-25n8c\") pod \"nova-cell0-conductor-0\" (UID: \"c31b4d97-4ea8-411f-873a-1ad6c133b917\") " pod="openstack/nova-cell0-conductor-0" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.737588 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.831340 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c31b4d97-4ea8-411f-873a-1ad6c133b917\") " pod="openstack/nova-cell0-conductor-0" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.831461 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c31b4d97-4ea8-411f-873a-1ad6c133b917\") " pod="openstack/nova-cell0-conductor-0" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.831486 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25n8c\" (UniqueName: \"kubernetes.io/projected/c31b4d97-4ea8-411f-873a-1ad6c133b917-kube-api-access-25n8c\") pod \"nova-cell0-conductor-0\" (UID: \"c31b4d97-4ea8-411f-873a-1ad6c133b917\") " pod="openstack/nova-cell0-conductor-0" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.835961 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c31b4d97-4ea8-411f-873a-1ad6c133b917\") " pod="openstack/nova-cell0-conductor-0" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.837135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c31b4d97-4ea8-411f-873a-1ad6c133b917\") " pod="openstack/nova-cell0-conductor-0" Mar 10 16:10:37 crc kubenswrapper[4749]: I0310 16:10:37.850229 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25n8c\" (UniqueName: \"kubernetes.io/projected/c31b4d97-4ea8-411f-873a-1ad6c133b917-kube-api-access-25n8c\") pod \"nova-cell0-conductor-0\" 
(UID: \"c31b4d97-4ea8-411f-873a-1ad6c133b917\") " pod="openstack/nova-cell0-conductor-0" Mar 10 16:10:38 crc kubenswrapper[4749]: I0310 16:10:38.037892 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 16:10:38 crc kubenswrapper[4749]: I0310 16:10:38.550373 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 16:10:38 crc kubenswrapper[4749]: I0310 16:10:38.604666 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c31b4d97-4ea8-411f-873a-1ad6c133b917","Type":"ContainerStarted","Data":"ca2f6719329181c4c75d1619750a38a62fa9939486405a0d591f04e0ab04fa8f"} Mar 10 16:10:39 crc kubenswrapper[4749]: I0310 16:10:39.623374 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c31b4d97-4ea8-411f-873a-1ad6c133b917","Type":"ContainerStarted","Data":"f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5"} Mar 10 16:10:39 crc kubenswrapper[4749]: I0310 16:10:39.623949 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 10 16:10:39 crc kubenswrapper[4749]: I0310 16:10:39.653287 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.653263338 podStartE2EDuration="2.653263338s" podCreationTimestamp="2026-03-10 16:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:10:39.641149254 +0000 UTC m=+1336.763014941" watchObservedRunningTime="2026-03-10 16:10:39.653263338 +0000 UTC m=+1336.775129025" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.235743 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.283798 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-ceilometer-tls-certs\") pod \"e580e649-77de-4c7c-9192-ba338b5fcd59\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.283870 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e580e649-77de-4c7c-9192-ba338b5fcd59-log-httpd\") pod \"e580e649-77de-4c7c-9192-ba338b5fcd59\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.283906 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e580e649-77de-4c7c-9192-ba338b5fcd59-run-httpd\") pod \"e580e649-77de-4c7c-9192-ba338b5fcd59\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.283958 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-config-data\") pod \"e580e649-77de-4c7c-9192-ba338b5fcd59\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.284009 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-scripts\") pod \"e580e649-77de-4c7c-9192-ba338b5fcd59\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.284057 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-combined-ca-bundle\") pod \"e580e649-77de-4c7c-9192-ba338b5fcd59\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.284126 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-sg-core-conf-yaml\") pod \"e580e649-77de-4c7c-9192-ba338b5fcd59\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.284279 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvkdb\" (UniqueName: \"kubernetes.io/projected/e580e649-77de-4c7c-9192-ba338b5fcd59-kube-api-access-pvkdb\") pod \"e580e649-77de-4c7c-9192-ba338b5fcd59\" (UID: \"e580e649-77de-4c7c-9192-ba338b5fcd59\") " Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.284577 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e580e649-77de-4c7c-9192-ba338b5fcd59-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e580e649-77de-4c7c-9192-ba338b5fcd59" (UID: "e580e649-77de-4c7c-9192-ba338b5fcd59"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.284625 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e580e649-77de-4c7c-9192-ba338b5fcd59-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e580e649-77de-4c7c-9192-ba338b5fcd59" (UID: "e580e649-77de-4c7c-9192-ba338b5fcd59"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.285147 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e580e649-77de-4c7c-9192-ba338b5fcd59-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.285176 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e580e649-77de-4c7c-9192-ba338b5fcd59-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.288799 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-scripts" (OuterVolumeSpecName: "scripts") pod "e580e649-77de-4c7c-9192-ba338b5fcd59" (UID: "e580e649-77de-4c7c-9192-ba338b5fcd59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.290316 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e580e649-77de-4c7c-9192-ba338b5fcd59-kube-api-access-pvkdb" (OuterVolumeSpecName: "kube-api-access-pvkdb") pod "e580e649-77de-4c7c-9192-ba338b5fcd59" (UID: "e580e649-77de-4c7c-9192-ba338b5fcd59"). InnerVolumeSpecName "kube-api-access-pvkdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.320155 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e580e649-77de-4c7c-9192-ba338b5fcd59" (UID: "e580e649-77de-4c7c-9192-ba338b5fcd59"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.350548 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e580e649-77de-4c7c-9192-ba338b5fcd59" (UID: "e580e649-77de-4c7c-9192-ba338b5fcd59"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.385823 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.385854 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvkdb\" (UniqueName: \"kubernetes.io/projected/e580e649-77de-4c7c-9192-ba338b5fcd59-kube-api-access-pvkdb\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.385866 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.385875 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.388215 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e580e649-77de-4c7c-9192-ba338b5fcd59" (UID: "e580e649-77de-4c7c-9192-ba338b5fcd59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.418739 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-config-data" (OuterVolumeSpecName: "config-data") pod "e580e649-77de-4c7c-9192-ba338b5fcd59" (UID: "e580e649-77de-4c7c-9192-ba338b5fcd59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.486812 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.486845 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e580e649-77de-4c7c-9192-ba338b5fcd59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.634202 4749 generic.go:334] "Generic (PLEG): container finished" podID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerID="d4b792ab0bcc5705430176a45b8287b9d8a6cf96d11c84fae30937cf82b75e62" exitCode=0 Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.634261 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e580e649-77de-4c7c-9192-ba338b5fcd59","Type":"ContainerDied","Data":"d4b792ab0bcc5705430176a45b8287b9d8a6cf96d11c84fae30937cf82b75e62"} Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.634308 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.634324 4749 scope.go:117] "RemoveContainer" containerID="6180f096c5ae7989c778c5c1b56ceef1f8f19122465a49ded72d00c4ac75eb33" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.634309 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e580e649-77de-4c7c-9192-ba338b5fcd59","Type":"ContainerDied","Data":"474b2cd1f63723920ba8ccab9ad6ba4e60684c87daaca691a2fce03b13a83291"} Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.679747 4749 scope.go:117] "RemoveContainer" containerID="68ff7ce9e6808006f78dff400a44ff5eeab8c16df562c0f501a36cef61333ac6" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.680009 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.758317 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.788948 4749 scope.go:117] "RemoveContainer" containerID="ec08f287b242dbcd1940ce0dab6cc40bc4a4cf380353a18ec70ed10f9f2291b1" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.810430 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:40 crc kubenswrapper[4749]: E0310 16:10:40.810820 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="sg-core" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.810836 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="sg-core" Mar 10 16:10:40 crc kubenswrapper[4749]: E0310 16:10:40.810848 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="proxy-httpd" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.810854 4749 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="proxy-httpd" Mar 10 16:10:40 crc kubenswrapper[4749]: E0310 16:10:40.810880 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="ceilometer-central-agent" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.810886 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="ceilometer-central-agent" Mar 10 16:10:40 crc kubenswrapper[4749]: E0310 16:10:40.810898 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="ceilometer-notification-agent" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.810904 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="ceilometer-notification-agent" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.811060 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="sg-core" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.811074 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="ceilometer-notification-agent" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.811086 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="ceilometer-central-agent" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.811098 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" containerName="proxy-httpd" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.816276 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.820383 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.820679 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.821337 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.843023 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.870895 4749 scope.go:117] "RemoveContainer" containerID="d4b792ab0bcc5705430176a45b8287b9d8a6cf96d11c84fae30937cf82b75e62" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.904446 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356bdd1f-5efb-4678-bb76-39d4720e16ba-run-httpd\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.904518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356bdd1f-5efb-4678-bb76-39d4720e16ba-log-httpd\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.904544 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-scripts\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:40 crc 
kubenswrapper[4749]: I0310 16:10:40.904580 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-config-data\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.904608 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgk4z\" (UniqueName: \"kubernetes.io/projected/356bdd1f-5efb-4678-bb76-39d4720e16ba-kube-api-access-pgk4z\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.904635 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.904654 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.904698 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.944986 4749 scope.go:117] "RemoveContainer" 
containerID="6180f096c5ae7989c778c5c1b56ceef1f8f19122465a49ded72d00c4ac75eb33" Mar 10 16:10:40 crc kubenswrapper[4749]: E0310 16:10:40.945518 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6180f096c5ae7989c778c5c1b56ceef1f8f19122465a49ded72d00c4ac75eb33\": container with ID starting with 6180f096c5ae7989c778c5c1b56ceef1f8f19122465a49ded72d00c4ac75eb33 not found: ID does not exist" containerID="6180f096c5ae7989c778c5c1b56ceef1f8f19122465a49ded72d00c4ac75eb33" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.945576 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6180f096c5ae7989c778c5c1b56ceef1f8f19122465a49ded72d00c4ac75eb33"} err="failed to get container status \"6180f096c5ae7989c778c5c1b56ceef1f8f19122465a49ded72d00c4ac75eb33\": rpc error: code = NotFound desc = could not find container \"6180f096c5ae7989c778c5c1b56ceef1f8f19122465a49ded72d00c4ac75eb33\": container with ID starting with 6180f096c5ae7989c778c5c1b56ceef1f8f19122465a49ded72d00c4ac75eb33 not found: ID does not exist" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.945598 4749 scope.go:117] "RemoveContainer" containerID="68ff7ce9e6808006f78dff400a44ff5eeab8c16df562c0f501a36cef61333ac6" Mar 10 16:10:40 crc kubenswrapper[4749]: E0310 16:10:40.945854 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68ff7ce9e6808006f78dff400a44ff5eeab8c16df562c0f501a36cef61333ac6\": container with ID starting with 68ff7ce9e6808006f78dff400a44ff5eeab8c16df562c0f501a36cef61333ac6 not found: ID does not exist" containerID="68ff7ce9e6808006f78dff400a44ff5eeab8c16df562c0f501a36cef61333ac6" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.945877 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"68ff7ce9e6808006f78dff400a44ff5eeab8c16df562c0f501a36cef61333ac6"} err="failed to get container status \"68ff7ce9e6808006f78dff400a44ff5eeab8c16df562c0f501a36cef61333ac6\": rpc error: code = NotFound desc = could not find container \"68ff7ce9e6808006f78dff400a44ff5eeab8c16df562c0f501a36cef61333ac6\": container with ID starting with 68ff7ce9e6808006f78dff400a44ff5eeab8c16df562c0f501a36cef61333ac6 not found: ID does not exist" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.945891 4749 scope.go:117] "RemoveContainer" containerID="ec08f287b242dbcd1940ce0dab6cc40bc4a4cf380353a18ec70ed10f9f2291b1" Mar 10 16:10:40 crc kubenswrapper[4749]: E0310 16:10:40.946148 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec08f287b242dbcd1940ce0dab6cc40bc4a4cf380353a18ec70ed10f9f2291b1\": container with ID starting with ec08f287b242dbcd1940ce0dab6cc40bc4a4cf380353a18ec70ed10f9f2291b1 not found: ID does not exist" containerID="ec08f287b242dbcd1940ce0dab6cc40bc4a4cf380353a18ec70ed10f9f2291b1" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.946192 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec08f287b242dbcd1940ce0dab6cc40bc4a4cf380353a18ec70ed10f9f2291b1"} err="failed to get container status \"ec08f287b242dbcd1940ce0dab6cc40bc4a4cf380353a18ec70ed10f9f2291b1\": rpc error: code = NotFound desc = could not find container \"ec08f287b242dbcd1940ce0dab6cc40bc4a4cf380353a18ec70ed10f9f2291b1\": container with ID starting with ec08f287b242dbcd1940ce0dab6cc40bc4a4cf380353a18ec70ed10f9f2291b1 not found: ID does not exist" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.946207 4749 scope.go:117] "RemoveContainer" containerID="d4b792ab0bcc5705430176a45b8287b9d8a6cf96d11c84fae30937cf82b75e62" Mar 10 16:10:40 crc kubenswrapper[4749]: E0310 16:10:40.946547 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d4b792ab0bcc5705430176a45b8287b9d8a6cf96d11c84fae30937cf82b75e62\": container with ID starting with d4b792ab0bcc5705430176a45b8287b9d8a6cf96d11c84fae30937cf82b75e62 not found: ID does not exist" containerID="d4b792ab0bcc5705430176a45b8287b9d8a6cf96d11c84fae30937cf82b75e62" Mar 10 16:10:40 crc kubenswrapper[4749]: I0310 16:10:40.946604 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b792ab0bcc5705430176a45b8287b9d8a6cf96d11c84fae30937cf82b75e62"} err="failed to get container status \"d4b792ab0bcc5705430176a45b8287b9d8a6cf96d11c84fae30937cf82b75e62\": rpc error: code = NotFound desc = could not find container \"d4b792ab0bcc5705430176a45b8287b9d8a6cf96d11c84fae30937cf82b75e62\": container with ID starting with d4b792ab0bcc5705430176a45b8287b9d8a6cf96d11c84fae30937cf82b75e62 not found: ID does not exist" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.006553 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356bdd1f-5efb-4678-bb76-39d4720e16ba-run-httpd\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.006601 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356bdd1f-5efb-4678-bb76-39d4720e16ba-log-httpd\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.006634 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-scripts\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc 
kubenswrapper[4749]: I0310 16:10:41.006675 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-config-data\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.006703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgk4z\" (UniqueName: \"kubernetes.io/projected/356bdd1f-5efb-4678-bb76-39d4720e16ba-kube-api-access-pgk4z\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.006734 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.006754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.006803 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.007367 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/356bdd1f-5efb-4678-bb76-39d4720e16ba-log-httpd\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.008352 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356bdd1f-5efb-4678-bb76-39d4720e16ba-run-httpd\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.011867 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.011878 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.012675 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.013521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-config-data\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.014590 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-scripts\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.029311 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgk4z\" (UniqueName: \"kubernetes.io/projected/356bdd1f-5efb-4678-bb76-39d4720e16ba-kube-api-access-pgk4z\") pod \"ceilometer-0\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.149227 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.624296 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e580e649-77de-4c7c-9192-ba338b5fcd59" path="/var/lib/kubelet/pods/e580e649-77de-4c7c-9192-ba338b5fcd59/volumes" Mar 10 16:10:41 crc kubenswrapper[4749]: I0310 16:10:41.642258 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:10:41 crc kubenswrapper[4749]: W0310 16:10:41.648824 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod356bdd1f_5efb_4678_bb76_39d4720e16ba.slice/crio-49b6c34fd018d5b300c83514935a6ebd534395adf134b7766c7f77b3f3dbc3b6 WatchSource:0}: Error finding container 49b6c34fd018d5b300c83514935a6ebd534395adf134b7766c7f77b3f3dbc3b6: Status 404 returned error can't find the container with id 49b6c34fd018d5b300c83514935a6ebd534395adf134b7766c7f77b3f3dbc3b6 Mar 10 16:10:42 crc kubenswrapper[4749]: I0310 16:10:42.662995 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"356bdd1f-5efb-4678-bb76-39d4720e16ba","Type":"ContainerStarted","Data":"b8f772b2003ce768d63b8764a45b2588ebad053a7ac134c0e4169b45b3ad18f7"} Mar 10 16:10:42 crc kubenswrapper[4749]: I0310 16:10:42.663730 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356bdd1f-5efb-4678-bb76-39d4720e16ba","Type":"ContainerStarted","Data":"49b6c34fd018d5b300c83514935a6ebd534395adf134b7766c7f77b3f3dbc3b6"} Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.089124 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.638539 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-bk27f"] Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.640601 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.650627 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.650944 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.655267 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bk27f"] Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.658591 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27gjn\" (UniqueName: \"kubernetes.io/projected/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-kube-api-access-27gjn\") pod \"nova-cell0-cell-mapping-bk27f\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.658656 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bk27f\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.658711 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-scripts\") pod \"nova-cell0-cell-mapping-bk27f\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.658738 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-config-data\") pod \"nova-cell0-cell-mapping-bk27f\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.701031 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356bdd1f-5efb-4678-bb76-39d4720e16ba","Type":"ContainerStarted","Data":"272cada3ceb61fc24759f133c47534f4b5a8230ffd3760fc625927da01eec0f8"} Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.761501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bk27f\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.761587 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-scripts\") pod \"nova-cell0-cell-mapping-bk27f\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.761642 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-config-data\") pod \"nova-cell0-cell-mapping-bk27f\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.761744 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27gjn\" (UniqueName: \"kubernetes.io/projected/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-kube-api-access-27gjn\") pod \"nova-cell0-cell-mapping-bk27f\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.788386 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-scripts\") pod \"nova-cell0-cell-mapping-bk27f\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.789204 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bk27f\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.801291 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27gjn\" (UniqueName: 
\"kubernetes.io/projected/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-kube-api-access-27gjn\") pod \"nova-cell0-cell-mapping-bk27f\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.809925 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-config-data\") pod \"nova-cell0-cell-mapping-bk27f\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.894597 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.897927 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.908557 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.910474 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.977851 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.985702 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-logs\") pod \"nova-api-0\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " pod="openstack/nova-api-0" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.985769 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-config-data\") pod \"nova-api-0\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " pod="openstack/nova-api-0" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.985836 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f9b2\" (UniqueName: \"kubernetes.io/projected/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-kube-api-access-8f9b2\") pod \"nova-api-0\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " pod="openstack/nova-api-0" Mar 10 16:10:43 crc kubenswrapper[4749]: I0310 16:10:43.985866 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " pod="openstack/nova-api-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.056444 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.059108 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.066488 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.090787 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-config-data\") pod \"nova-api-0\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " pod="openstack/nova-api-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.091076 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b99cb14-8007-41ca-9c15-4af7fad17cf1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " pod="openstack/nova-metadata-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.092994 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.093523 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvt56\" (UniqueName: \"kubernetes.io/projected/5b99cb14-8007-41ca-9c15-4af7fad17cf1-kube-api-access-fvt56\") pod \"nova-metadata-0\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " pod="openstack/nova-metadata-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.093620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f9b2\" (UniqueName: \"kubernetes.io/projected/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-kube-api-access-8f9b2\") pod \"nova-api-0\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " pod="openstack/nova-api-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.093688 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " pod="openstack/nova-api-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.093727 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b99cb14-8007-41ca-9c15-4af7fad17cf1-logs\") pod \"nova-metadata-0\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " pod="openstack/nova-metadata-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.093861 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-logs\") pod \"nova-api-0\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " pod="openstack/nova-api-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.093895 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b99cb14-8007-41ca-9c15-4af7fad17cf1-config-data\") pod \"nova-metadata-0\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " pod="openstack/nova-metadata-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.098120 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-logs\") pod \"nova-api-0\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " pod="openstack/nova-api-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.114079 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-config-data\") pod \"nova-api-0\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " pod="openstack/nova-api-0" Mar 10 16:10:44 crc 
kubenswrapper[4749]: I0310 16:10:44.123514 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " pod="openstack/nova-api-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.125957 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.127532 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.132508 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.147105 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f9b2\" (UniqueName: \"kubernetes.io/projected/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-kube-api-access-8f9b2\") pod \"nova-api-0\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " pod="openstack/nova-api-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.151591 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.152824 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.157877 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.168459 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.180740 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.196857 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5695c9cc-pfsz9"] Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.198440 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.202544 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b99cb14-8007-41ca-9c15-4af7fad17cf1-logs\") pod \"nova-metadata-0\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " pod="openstack/nova-metadata-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.202612 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-config-data\") pod \"nova-scheduler-0\" (UID: \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\") " pod="openstack/nova-scheduler-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.202687 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b99cb14-8007-41ca-9c15-4af7fad17cf1-config-data\") pod \"nova-metadata-0\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " pod="openstack/nova-metadata-0" Mar 10 16:10:44 crc kubenswrapper[4749]: 
I0310 16:10:44.202730 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8zwj\" (UniqueName: \"kubernetes.io/projected/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-kube-api-access-b8zwj\") pod \"nova-scheduler-0\" (UID: \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\") " pod="openstack/nova-scheduler-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.202769 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzj5h\" (UniqueName: \"kubernetes.io/projected/11b31489-22e6-4403-a5f3-9375a8ac4fef-kube-api-access-mzj5h\") pod \"nova-cell1-novncproxy-0\" (UID: \"11b31489-22e6-4403-a5f3-9375a8ac4fef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.202811 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11b31489-22e6-4403-a5f3-9375a8ac4fef-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"11b31489-22e6-4403-a5f3-9375a8ac4fef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.202845 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\") " pod="openstack/nova-scheduler-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.202876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b99cb14-8007-41ca-9c15-4af7fad17cf1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " pod="openstack/nova-metadata-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.202928 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvt56\" (UniqueName: \"kubernetes.io/projected/5b99cb14-8007-41ca-9c15-4af7fad17cf1-kube-api-access-fvt56\") pod \"nova-metadata-0\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " pod="openstack/nova-metadata-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.202968 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11b31489-22e6-4403-a5f3-9375a8ac4fef-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"11b31489-22e6-4403-a5f3-9375a8ac4fef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.204193 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b99cb14-8007-41ca-9c15-4af7fad17cf1-logs\") pod \"nova-metadata-0\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " pod="openstack/nova-metadata-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.225744 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b99cb14-8007-41ca-9c15-4af7fad17cf1-config-data\") pod \"nova-metadata-0\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " pod="openstack/nova-metadata-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.226442 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b99cb14-8007-41ca-9c15-4af7fad17cf1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " pod="openstack/nova-metadata-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.233723 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5695c9cc-pfsz9"] Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.235485 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fvt56\" (UniqueName: \"kubernetes.io/projected/5b99cb14-8007-41ca-9c15-4af7fad17cf1-kube-api-access-fvt56\") pod \"nova-metadata-0\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " pod="openstack/nova-metadata-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.269074 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.306584 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8zwj\" (UniqueName: \"kubernetes.io/projected/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-kube-api-access-b8zwj\") pod \"nova-scheduler-0\" (UID: \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\") " pod="openstack/nova-scheduler-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.306664 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.306704 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzj5h\" (UniqueName: \"kubernetes.io/projected/11b31489-22e6-4403-a5f3-9375a8ac4fef-kube-api-access-mzj5h\") pod \"nova-cell1-novncproxy-0\" (UID: \"11b31489-22e6-4403-a5f3-9375a8ac4fef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.306768 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " 
pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.306807 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11b31489-22e6-4403-a5f3-9375a8ac4fef-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"11b31489-22e6-4403-a5f3-9375a8ac4fef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.306869 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\") " pod="openstack/nova-scheduler-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.306911 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-svc\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.306963 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-config\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.307049 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11b31489-22e6-4403-a5f3-9375a8ac4fef-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"11b31489-22e6-4403-a5f3-9375a8ac4fef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:10:44 crc kubenswrapper[4749]: 
I0310 16:10:44.307137 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8qlk\" (UniqueName: \"kubernetes.io/projected/e30eb114-ece0-4fa1-ba0d-85a33de05463-kube-api-access-b8qlk\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.307180 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.307223 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-config-data\") pod \"nova-scheduler-0\" (UID: \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\") " pod="openstack/nova-scheduler-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.328510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11b31489-22e6-4403-a5f3-9375a8ac4fef-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"11b31489-22e6-4403-a5f3-9375a8ac4fef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.330602 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11b31489-22e6-4403-a5f3-9375a8ac4fef-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"11b31489-22e6-4403-a5f3-9375a8ac4fef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.330990 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\") " pod="openstack/nova-scheduler-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.333175 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-config-data\") pod \"nova-scheduler-0\" (UID: \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\") " pod="openstack/nova-scheduler-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.337438 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8zwj\" (UniqueName: \"kubernetes.io/projected/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-kube-api-access-b8zwj\") pod \"nova-scheduler-0\" (UID: \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\") " pod="openstack/nova-scheduler-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.361188 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzj5h\" (UniqueName: \"kubernetes.io/projected/11b31489-22e6-4403-a5f3-9375a8ac4fef-kube-api-access-mzj5h\") pod \"nova-cell1-novncproxy-0\" (UID: \"11b31489-22e6-4403-a5f3-9375a8ac4fef\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.384413 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.398035 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.410000 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.410084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.410136 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-svc\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.410193 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-config\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.410311 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8qlk\" (UniqueName: \"kubernetes.io/projected/e30eb114-ece0-4fa1-ba0d-85a33de05463-kube-api-access-b8qlk\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " 
pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.410344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.411493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.412116 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-svc\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.412309 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-config\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.412769 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 
16:10:44.423645 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.432414 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8qlk\" (UniqueName: \"kubernetes.io/projected/e30eb114-ece0-4fa1-ba0d-85a33de05463-kube-api-access-b8qlk\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.438630 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5695c9cc-pfsz9\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.680475 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bk27f"] Mar 10 16:10:44 crc kubenswrapper[4749]: W0310 16:10:44.695935 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1424b4e_9b0e_4108_81e0_6adcd7ec34cc.slice/crio-78f65590e5075f5506d8c7fa4ec380e35bc3d0c421e41513eb2db1c8e3b0757e WatchSource:0}: Error finding container 78f65590e5075f5506d8c7fa4ec380e35bc3d0c421e41513eb2db1c8e3b0757e: Status 404 returned error can't find the container with id 78f65590e5075f5506d8c7fa4ec380e35bc3d0c421e41513eb2db1c8e3b0757e Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.722628 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bk27f" event={"ID":"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc","Type":"ContainerStarted","Data":"78f65590e5075f5506d8c7fa4ec380e35bc3d0c421e41513eb2db1c8e3b0757e"} Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.725538 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356bdd1f-5efb-4678-bb76-39d4720e16ba","Type":"ContainerStarted","Data":"481f4c3c3298a19c8b7aa10dee7982733390754a42eafc99e12365813a3876f9"} Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.741815 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:44 crc kubenswrapper[4749]: W0310 16:10:44.931637 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c504bbd_1bcd_4d75_b879_1ec5e50116e2.slice/crio-b0a265455d1fa39f5e707f7c742e174818283d991222aa3f718aa741aadf4f87 WatchSource:0}: Error finding container b0a265455d1fa39f5e707f7c742e174818283d991222aa3f718aa741aadf4f87: Status 404 returned error can't find the container with id b0a265455d1fa39f5e707f7c742e174818283d991222aa3f718aa741aadf4f87 Mar 10 16:10:44 crc kubenswrapper[4749]: I0310 16:10:44.931893 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.042482 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sb5c8"] Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.043714 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.046110 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.047332 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.055493 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sb5c8"] Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.069218 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.133453 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sb5c8\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.133560 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-config-data\") pod \"nova-cell1-conductor-db-sync-sb5c8\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.133593 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvk94\" (UniqueName: \"kubernetes.io/projected/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-kube-api-access-qvk94\") pod \"nova-cell1-conductor-db-sync-sb5c8\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " 
pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.133640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-scripts\") pod \"nova-cell1-conductor-db-sync-sb5c8\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:45 crc kubenswrapper[4749]: W0310 16:10:45.155866 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b31489_22e6_4403_a5f3_9375a8ac4fef.slice/crio-226f3b33104ddf6e6f97b0e07dd61c3cf411de023055d10b4e47cb1f6e36a9ae WatchSource:0}: Error finding container 226f3b33104ddf6e6f97b0e07dd61c3cf411de023055d10b4e47cb1f6e36a9ae: Status 404 returned error can't find the container with id 226f3b33104ddf6e6f97b0e07dd61c3cf411de023055d10b4e47cb1f6e36a9ae Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.157088 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.196854 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.241817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-config-data\") pod \"nova-cell1-conductor-db-sync-sb5c8\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.243502 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvk94\" (UniqueName: \"kubernetes.io/projected/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-kube-api-access-qvk94\") pod 
\"nova-cell1-conductor-db-sync-sb5c8\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.243674 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-scripts\") pod \"nova-cell1-conductor-db-sync-sb5c8\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.244166 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sb5c8\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.255605 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sb5c8\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.256161 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-config-data\") pod \"nova-cell1-conductor-db-sync-sb5c8\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.266961 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvk94\" (UniqueName: \"kubernetes.io/projected/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-kube-api-access-qvk94\") pod 
\"nova-cell1-conductor-db-sync-sb5c8\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.268704 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-scripts\") pod \"nova-cell1-conductor-db-sync-sb5c8\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.336430 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5695c9cc-pfsz9"] Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.537253 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.737336 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c504bbd-1bcd-4d75-b879-1ec5e50116e2","Type":"ContainerStarted","Data":"b0a265455d1fa39f5e707f7c742e174818283d991222aa3f718aa741aadf4f87"} Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.738894 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"11b31489-22e6-4403-a5f3-9375a8ac4fef","Type":"ContainerStarted","Data":"226f3b33104ddf6e6f97b0e07dd61c3cf411de023055d10b4e47cb1f6e36a9ae"} Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.740330 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd","Type":"ContainerStarted","Data":"59ef5747157c8338459e8004009f9e330405f7e2e7fbb846f111e3e680d2f4ee"} Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.743691 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bk27f" 
event={"ID":"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc","Type":"ContainerStarted","Data":"17f3949302b878f73539d2cfebec421f4f33ea6abd3c520c71fdcdad36b2d285"} Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.748342 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b99cb14-8007-41ca-9c15-4af7fad17cf1","Type":"ContainerStarted","Data":"e58f10ca18703a4086b0e2da4e55f08465fd631f28a20f279b82ebb428d8bafa"} Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.750130 4749 generic.go:334] "Generic (PLEG): container finished" podID="e30eb114-ece0-4fa1-ba0d-85a33de05463" containerID="a96cb0fcbf7265ea7eab4e2c14814d62be59f1b97490561ea4c3657b140c4cff" exitCode=0 Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.750168 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" event={"ID":"e30eb114-ece0-4fa1-ba0d-85a33de05463","Type":"ContainerDied","Data":"a96cb0fcbf7265ea7eab4e2c14814d62be59f1b97490561ea4c3657b140c4cff"} Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.750183 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" event={"ID":"e30eb114-ece0-4fa1-ba0d-85a33de05463","Type":"ContainerStarted","Data":"1dfc49d426645f3c96d6c1d79858829cbd852a40f8e3ff4370a67615ee4565f7"} Mar 10 16:10:45 crc kubenswrapper[4749]: I0310 16:10:45.774837 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-bk27f" podStartSLOduration=2.7747834940000002 podStartE2EDuration="2.774783494s" podCreationTimestamp="2026-03-10 16:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:10:45.757851127 +0000 UTC m=+1342.879716804" watchObservedRunningTime="2026-03-10 16:10:45.774783494 +0000 UTC m=+1342.896649181" Mar 10 16:10:46 crc kubenswrapper[4749]: I0310 16:10:46.120956 4749 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sb5c8"] Mar 10 16:10:46 crc kubenswrapper[4749]: W0310 16:10:46.134463 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda738c4f5_0130_4bf4_85a4_ebbf9188ac51.slice/crio-194ce1e4b9e7678c608d3bfa894bc317f939723744203b4b7f7d954ab20b7653 WatchSource:0}: Error finding container 194ce1e4b9e7678c608d3bfa894bc317f939723744203b4b7f7d954ab20b7653: Status 404 returned error can't find the container with id 194ce1e4b9e7678c608d3bfa894bc317f939723744203b4b7f7d954ab20b7653 Mar 10 16:10:46 crc kubenswrapper[4749]: I0310 16:10:46.762828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sb5c8" event={"ID":"a738c4f5-0130-4bf4-85a4-ebbf9188ac51","Type":"ContainerStarted","Data":"b0e71ba5febb6a769078d2e5e9beb76c2914f70feafac428695b051ac58e6b77"} Mar 10 16:10:46 crc kubenswrapper[4749]: I0310 16:10:46.762884 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sb5c8" event={"ID":"a738c4f5-0130-4bf4-85a4-ebbf9188ac51","Type":"ContainerStarted","Data":"194ce1e4b9e7678c608d3bfa894bc317f939723744203b4b7f7d954ab20b7653"} Mar 10 16:10:46 crc kubenswrapper[4749]: I0310 16:10:46.771063 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" event={"ID":"e30eb114-ece0-4fa1-ba0d-85a33de05463","Type":"ContainerStarted","Data":"d98ff629c8a6f49966351adb1d7774cf20efbbd1a83fc5945d726cf4ccbcc436"} Mar 10 16:10:46 crc kubenswrapper[4749]: I0310 16:10:46.771242 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:46 crc kubenswrapper[4749]: I0310 16:10:46.781141 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-sb5c8" podStartSLOduration=1.7811150599999999 
podStartE2EDuration="1.78111506s" podCreationTimestamp="2026-03-10 16:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:10:46.777212323 +0000 UTC m=+1343.899078010" watchObservedRunningTime="2026-03-10 16:10:46.78111506 +0000 UTC m=+1343.902980747" Mar 10 16:10:46 crc kubenswrapper[4749]: I0310 16:10:46.805505 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" podStartSLOduration=2.805473392 podStartE2EDuration="2.805473392s" podCreationTimestamp="2026-03-10 16:10:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:10:46.795838187 +0000 UTC m=+1343.917703874" watchObservedRunningTime="2026-03-10 16:10:46.805473392 +0000 UTC m=+1343.927339079" Mar 10 16:10:47 crc kubenswrapper[4749]: I0310 16:10:47.719456 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 16:10:47 crc kubenswrapper[4749]: I0310 16:10:47.741288 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:10:47 crc kubenswrapper[4749]: I0310 16:10:47.800565 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356bdd1f-5efb-4678-bb76-39d4720e16ba","Type":"ContainerStarted","Data":"97ffe531c091ed348b7afa06530a7a2f977556665000110bd3b00771cb96df2f"} Mar 10 16:10:47 crc kubenswrapper[4749]: I0310 16:10:47.801157 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 16:10:47 crc kubenswrapper[4749]: I0310 16:10:47.836788 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.16317576 podStartE2EDuration="7.836769137s" podCreationTimestamp="2026-03-10 16:10:40 +0000 UTC" 
firstStartedPulling="2026-03-10 16:10:41.652390299 +0000 UTC m=+1338.774255986" lastFinishedPulling="2026-03-10 16:10:46.325983676 +0000 UTC m=+1343.447849363" observedRunningTime="2026-03-10 16:10:47.833504088 +0000 UTC m=+1344.955369775" watchObservedRunningTime="2026-03-10 16:10:47.836769137 +0000 UTC m=+1344.958634824" Mar 10 16:10:48 crc kubenswrapper[4749]: I0310 16:10:48.813007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b99cb14-8007-41ca-9c15-4af7fad17cf1","Type":"ContainerStarted","Data":"ac776a91b99c2197d172d9af7762ff9be871c1e6e29495dc0e8af2bd81ac9a0e"} Mar 10 16:10:48 crc kubenswrapper[4749]: I0310 16:10:48.813820 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b99cb14-8007-41ca-9c15-4af7fad17cf1","Type":"ContainerStarted","Data":"b369383240a273cbafd6ce35533215726417c106a005d8eec338353a886b07af"} Mar 10 16:10:48 crc kubenswrapper[4749]: I0310 16:10:48.813683 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5b99cb14-8007-41ca-9c15-4af7fad17cf1" containerName="nova-metadata-metadata" containerID="cri-o://ac776a91b99c2197d172d9af7762ff9be871c1e6e29495dc0e8af2bd81ac9a0e" gracePeriod=30 Mar 10 16:10:48 crc kubenswrapper[4749]: I0310 16:10:48.813106 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5b99cb14-8007-41ca-9c15-4af7fad17cf1" containerName="nova-metadata-log" containerID="cri-o://b369383240a273cbafd6ce35533215726417c106a005d8eec338353a886b07af" gracePeriod=30 Mar 10 16:10:48 crc kubenswrapper[4749]: I0310 16:10:48.822297 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c504bbd-1bcd-4d75-b879-1ec5e50116e2","Type":"ContainerStarted","Data":"29d0dd8e218096a816a7f52245ea13ccf5579cfa2e6bf9f4ec0038717968de36"} Mar 10 16:10:48 crc kubenswrapper[4749]: I0310 16:10:48.822353 
4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c504bbd-1bcd-4d75-b879-1ec5e50116e2","Type":"ContainerStarted","Data":"b118664ea44bf03b6fa23d6923c2074c6ec65d4c3fc8ea7d3ab0e04eed49dc8b"} Mar 10 16:10:48 crc kubenswrapper[4749]: I0310 16:10:48.833357 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"11b31489-22e6-4403-a5f3-9375a8ac4fef","Type":"ContainerStarted","Data":"98312c49efd18bb55edfdeec5639eeac9104f6971091f342401d0ecddce25c0b"} Mar 10 16:10:48 crc kubenswrapper[4749]: I0310 16:10:48.834504 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="11b31489-22e6-4403-a5f3-9375a8ac4fef" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://98312c49efd18bb55edfdeec5639eeac9104f6971091f342401d0ecddce25c0b" gracePeriod=30 Mar 10 16:10:48 crc kubenswrapper[4749]: I0310 16:10:48.835215 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.671442276 podStartE2EDuration="5.835204348s" podCreationTimestamp="2026-03-10 16:10:43 +0000 UTC" firstStartedPulling="2026-03-10 16:10:45.089390903 +0000 UTC m=+1342.211256590" lastFinishedPulling="2026-03-10 16:10:48.253152965 +0000 UTC m=+1345.375018662" observedRunningTime="2026-03-10 16:10:48.833025697 +0000 UTC m=+1345.954891404" watchObservedRunningTime="2026-03-10 16:10:48.835204348 +0000 UTC m=+1345.957070035" Mar 10 16:10:48 crc kubenswrapper[4749]: I0310 16:10:48.851537 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd","Type":"ContainerStarted","Data":"60e9adf056e23c215faeea578a02083d96813af621e53bf690d7b60f0e0a0479"} Mar 10 16:10:48 crc kubenswrapper[4749]: I0310 16:10:48.861595 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" 
podStartSLOduration=2.552711822 podStartE2EDuration="5.861571834s" podCreationTimestamp="2026-03-10 16:10:43 +0000 UTC" firstStartedPulling="2026-03-10 16:10:44.936392205 +0000 UTC m=+1342.058257892" lastFinishedPulling="2026-03-10 16:10:48.245252197 +0000 UTC m=+1345.367117904" observedRunningTime="2026-03-10 16:10:48.855363903 +0000 UTC m=+1345.977229590" watchObservedRunningTime="2026-03-10 16:10:48.861571834 +0000 UTC m=+1345.983437521" Mar 10 16:10:48 crc kubenswrapper[4749]: I0310 16:10:48.880617 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.802532619 podStartE2EDuration="4.880601299s" podCreationTimestamp="2026-03-10 16:10:44 +0000 UTC" firstStartedPulling="2026-03-10 16:10:45.17746736 +0000 UTC m=+1342.299333047" lastFinishedPulling="2026-03-10 16:10:48.25553603 +0000 UTC m=+1345.377401727" observedRunningTime="2026-03-10 16:10:48.875839177 +0000 UTC m=+1345.997704864" watchObservedRunningTime="2026-03-10 16:10:48.880601299 +0000 UTC m=+1346.002466986" Mar 10 16:10:48 crc kubenswrapper[4749]: I0310 16:10:48.905627 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.807261569 podStartE2EDuration="5.905608378s" podCreationTimestamp="2026-03-10 16:10:43 +0000 UTC" firstStartedPulling="2026-03-10 16:10:45.177058969 +0000 UTC m=+1342.298924656" lastFinishedPulling="2026-03-10 16:10:48.275405778 +0000 UTC m=+1345.397271465" observedRunningTime="2026-03-10 16:10:48.898678047 +0000 UTC m=+1346.020543734" watchObservedRunningTime="2026-03-10 16:10:48.905608378 +0000 UTC m=+1346.027474065" Mar 10 16:10:49 crc kubenswrapper[4749]: I0310 16:10:49.385165 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 16:10:49 crc kubenswrapper[4749]: I0310 16:10:49.385211 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Mar 10 16:10:49 crc kubenswrapper[4749]: I0310 16:10:49.400446 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:10:49 crc kubenswrapper[4749]: I0310 16:10:49.424862 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 16:10:49 crc kubenswrapper[4749]: I0310 16:10:49.869668 4749 generic.go:334] "Generic (PLEG): container finished" podID="5b99cb14-8007-41ca-9c15-4af7fad17cf1" containerID="b369383240a273cbafd6ce35533215726417c106a005d8eec338353a886b07af" exitCode=143 Mar 10 16:10:49 crc kubenswrapper[4749]: I0310 16:10:49.869763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b99cb14-8007-41ca-9c15-4af7fad17cf1","Type":"ContainerDied","Data":"b369383240a273cbafd6ce35533215726417c106a005d8eec338353a886b07af"} Mar 10 16:10:50 crc kubenswrapper[4749]: I0310 16:10:50.980600 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:10:50 crc kubenswrapper[4749]: I0310 16:10:50.981864 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:10:50 crc kubenswrapper[4749]: I0310 16:10:50.982010 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 16:10:50 crc kubenswrapper[4749]: I0310 16:10:50.983013 4749 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"106da756b634d444f1a07a98c656ecf91e046a9d0f74a54a7001a123a154d3af"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:10:50 crc kubenswrapper[4749]: I0310 16:10:50.983191 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://106da756b634d444f1a07a98c656ecf91e046a9d0f74a54a7001a123a154d3af" gracePeriod=600 Mar 10 16:10:51 crc kubenswrapper[4749]: I0310 16:10:51.892947 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="106da756b634d444f1a07a98c656ecf91e046a9d0f74a54a7001a123a154d3af" exitCode=0 Mar 10 16:10:51 crc kubenswrapper[4749]: I0310 16:10:51.893017 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"106da756b634d444f1a07a98c656ecf91e046a9d0f74a54a7001a123a154d3af"} Mar 10 16:10:51 crc kubenswrapper[4749]: I0310 16:10:51.893731 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"f872d8a7b451ffda56ecf850ae4189e8cc776f81c01a3e3b24b44b064505fbb1"} Mar 10 16:10:51 crc kubenswrapper[4749]: I0310 16:10:51.893757 4749 scope.go:117] "RemoveContainer" containerID="8df1beddcbbe4b28bedf74a692eb90fcfbb0b66981e27d81e53ac5b8485c3d4f" Mar 10 16:10:53 crc kubenswrapper[4749]: I0310 16:10:53.913909 4749 generic.go:334] "Generic (PLEG): container finished" podID="a738c4f5-0130-4bf4-85a4-ebbf9188ac51" 
containerID="b0e71ba5febb6a769078d2e5e9beb76c2914f70feafac428695b051ac58e6b77" exitCode=0 Mar 10 16:10:53 crc kubenswrapper[4749]: I0310 16:10:53.913980 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sb5c8" event={"ID":"a738c4f5-0130-4bf4-85a4-ebbf9188ac51","Type":"ContainerDied","Data":"b0e71ba5febb6a769078d2e5e9beb76c2914f70feafac428695b051ac58e6b77"} Mar 10 16:10:53 crc kubenswrapper[4749]: I0310 16:10:53.916119 4749 generic.go:334] "Generic (PLEG): container finished" podID="b1424b4e-9b0e-4108-81e0-6adcd7ec34cc" containerID="17f3949302b878f73539d2cfebec421f4f33ea6abd3c520c71fdcdad36b2d285" exitCode=0 Mar 10 16:10:53 crc kubenswrapper[4749]: I0310 16:10:53.916157 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bk27f" event={"ID":"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc","Type":"ContainerDied","Data":"17f3949302b878f73539d2cfebec421f4f33ea6abd3c520c71fdcdad36b2d285"} Mar 10 16:10:54 crc kubenswrapper[4749]: I0310 16:10:54.269704 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 16:10:54 crc kubenswrapper[4749]: I0310 16:10:54.270211 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 16:10:54 crc kubenswrapper[4749]: I0310 16:10:54.425411 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 16:10:54 crc kubenswrapper[4749]: I0310 16:10:54.457301 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 16:10:54 crc kubenswrapper[4749]: I0310 16:10:54.743656 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:10:54 crc kubenswrapper[4749]: I0310 16:10:54.820220 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86dc97b969-c77bn"] 
Mar 10 16:10:54 crc kubenswrapper[4749]: I0310 16:10:54.827194 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86dc97b969-c77bn" podUID="85834161-43ab-465b-bb71-811ed69c132b" containerName="dnsmasq-dns" containerID="cri-o://8914d90010a8494bbf6ad9346d0b25d6a194b8e682a59f9b058c540a1a071030" gracePeriod=10 Mar 10 16:10:54 crc kubenswrapper[4749]: I0310 16:10:54.966865 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.350460 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.357565 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0c504bbd-1bcd-4d75-b879-1ec5e50116e2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.357897 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0c504bbd-1bcd-4d75-b879-1ec5e50116e2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.492207 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-scripts\") pod \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.492259 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27gjn\" (UniqueName: 
\"kubernetes.io/projected/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-kube-api-access-27gjn\") pod \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.492408 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-config-data\") pod \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.492455 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-combined-ca-bundle\") pod \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\" (UID: \"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc\") " Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.498993 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-kube-api-access-27gjn" (OuterVolumeSpecName: "kube-api-access-27gjn") pod "b1424b4e-9b0e-4108-81e0-6adcd7ec34cc" (UID: "b1424b4e-9b0e-4108-81e0-6adcd7ec34cc"). InnerVolumeSpecName "kube-api-access-27gjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.499856 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-scripts" (OuterVolumeSpecName: "scripts") pod "b1424b4e-9b0e-4108-81e0-6adcd7ec34cc" (UID: "b1424b4e-9b0e-4108-81e0-6adcd7ec34cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.516246 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.523680 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-config-data" (OuterVolumeSpecName: "config-data") pod "b1424b4e-9b0e-4108-81e0-6adcd7ec34cc" (UID: "b1424b4e-9b0e-4108-81e0-6adcd7ec34cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.524495 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1424b4e-9b0e-4108-81e0-6adcd7ec34cc" (UID: "b1424b4e-9b0e-4108-81e0-6adcd7ec34cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.524966 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.593937 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-config\") pod \"85834161-43ab-465b-bb71-811ed69c132b\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.594031 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-dns-swift-storage-0\") pod \"85834161-43ab-465b-bb71-811ed69c132b\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.594105 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-scripts\") pod \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.594126 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-dns-svc\") pod \"85834161-43ab-465b-bb71-811ed69c132b\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.594142 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-ovsdbserver-sb\") pod \"85834161-43ab-465b-bb71-811ed69c132b\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.594183 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd5x2\" (UniqueName: 
\"kubernetes.io/projected/85834161-43ab-465b-bb71-811ed69c132b-kube-api-access-bd5x2\") pod \"85834161-43ab-465b-bb71-811ed69c132b\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.594228 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-ovsdbserver-nb\") pod \"85834161-43ab-465b-bb71-811ed69c132b\" (UID: \"85834161-43ab-465b-bb71-811ed69c132b\") " Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.594306 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-combined-ca-bundle\") pod \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.594351 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-config-data\") pod \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.594370 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvk94\" (UniqueName: \"kubernetes.io/projected/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-kube-api-access-qvk94\") pod \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\" (UID: \"a738c4f5-0130-4bf4-85a4-ebbf9188ac51\") " Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.594750 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.594766 4749 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-27gjn\" (UniqueName: \"kubernetes.io/projected/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-kube-api-access-27gjn\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.594775 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.594784 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.600559 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-kube-api-access-qvk94" (OuterVolumeSpecName: "kube-api-access-qvk94") pod "a738c4f5-0130-4bf4-85a4-ebbf9188ac51" (UID: "a738c4f5-0130-4bf4-85a4-ebbf9188ac51"). InnerVolumeSpecName "kube-api-access-qvk94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.616193 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-scripts" (OuterVolumeSpecName: "scripts") pod "a738c4f5-0130-4bf4-85a4-ebbf9188ac51" (UID: "a738c4f5-0130-4bf4-85a4-ebbf9188ac51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.617378 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85834161-43ab-465b-bb71-811ed69c132b-kube-api-access-bd5x2" (OuterVolumeSpecName: "kube-api-access-bd5x2") pod "85834161-43ab-465b-bb71-811ed69c132b" (UID: "85834161-43ab-465b-bb71-811ed69c132b"). InnerVolumeSpecName "kube-api-access-bd5x2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.629545 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a738c4f5-0130-4bf4-85a4-ebbf9188ac51" (UID: "a738c4f5-0130-4bf4-85a4-ebbf9188ac51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.645547 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-config-data" (OuterVolumeSpecName: "config-data") pod "a738c4f5-0130-4bf4-85a4-ebbf9188ac51" (UID: "a738c4f5-0130-4bf4-85a4-ebbf9188ac51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.656834 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "85834161-43ab-465b-bb71-811ed69c132b" (UID: "85834161-43ab-465b-bb71-811ed69c132b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.657482 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "85834161-43ab-465b-bb71-811ed69c132b" (UID: "85834161-43ab-465b-bb71-811ed69c132b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.657988 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "85834161-43ab-465b-bb71-811ed69c132b" (UID: "85834161-43ab-465b-bb71-811ed69c132b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.664789 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-config" (OuterVolumeSpecName: "config") pod "85834161-43ab-465b-bb71-811ed69c132b" (UID: "85834161-43ab-465b-bb71-811ed69c132b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.669452 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "85834161-43ab-465b-bb71-811ed69c132b" (UID: "85834161-43ab-465b-bb71-811ed69c132b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.696283 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.696321 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.696333 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.696343 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.696351 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd5x2\" (UniqueName: \"kubernetes.io/projected/85834161-43ab-465b-bb71-811ed69c132b-kube-api-access-bd5x2\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.696361 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.696369 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.696389 4749 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.696397 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvk94\" (UniqueName: \"kubernetes.io/projected/a738c4f5-0130-4bf4-85a4-ebbf9188ac51-kube-api-access-qvk94\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.696406 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85834161-43ab-465b-bb71-811ed69c132b-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.937351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sb5c8" event={"ID":"a738c4f5-0130-4bf4-85a4-ebbf9188ac51","Type":"ContainerDied","Data":"194ce1e4b9e7678c608d3bfa894bc317f939723744203b4b7f7d954ab20b7653"} Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.937757 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="194ce1e4b9e7678c608d3bfa894bc317f939723744203b4b7f7d954ab20b7653" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.937819 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sb5c8" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.954210 4749 generic.go:334] "Generic (PLEG): container finished" podID="85834161-43ab-465b-bb71-811ed69c132b" containerID="8914d90010a8494bbf6ad9346d0b25d6a194b8e682a59f9b058c540a1a071030" exitCode=0 Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.954280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dc97b969-c77bn" event={"ID":"85834161-43ab-465b-bb71-811ed69c132b","Type":"ContainerDied","Data":"8914d90010a8494bbf6ad9346d0b25d6a194b8e682a59f9b058c540a1a071030"} Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.954311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dc97b969-c77bn" event={"ID":"85834161-43ab-465b-bb71-811ed69c132b","Type":"ContainerDied","Data":"b3a4311e697c5e0b6418409fe3abfc8d0559e829862b4ecfd946afd7ead56ba6"} Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.954331 4749 scope.go:117] "RemoveContainer" containerID="8914d90010a8494bbf6ad9346d0b25d6a194b8e682a59f9b058c540a1a071030" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.954476 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dc97b969-c77bn" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.964269 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bk27f" Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.964263 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bk27f" event={"ID":"b1424b4e-9b0e-4108-81e0-6adcd7ec34cc","Type":"ContainerDied","Data":"78f65590e5075f5506d8c7fa4ec380e35bc3d0c421e41513eb2db1c8e3b0757e"} Mar 10 16:10:55 crc kubenswrapper[4749]: I0310 16:10:55.964349 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78f65590e5075f5506d8c7fa4ec380e35bc3d0c421e41513eb2db1c8e3b0757e" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.049860 4749 scope.go:117] "RemoveContainer" containerID="892aaf5f6d48acc7761f831589c515f4a6698ac0529e229f1a6febfd5df4373f" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.056602 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 16:10:56 crc kubenswrapper[4749]: E0310 16:10:56.057114 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85834161-43ab-465b-bb71-811ed69c132b" containerName="init" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.057131 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85834161-43ab-465b-bb71-811ed69c132b" containerName="init" Mar 10 16:10:56 crc kubenswrapper[4749]: E0310 16:10:56.057164 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a738c4f5-0130-4bf4-85a4-ebbf9188ac51" containerName="nova-cell1-conductor-db-sync" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.057173 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a738c4f5-0130-4bf4-85a4-ebbf9188ac51" containerName="nova-cell1-conductor-db-sync" Mar 10 16:10:56 crc kubenswrapper[4749]: E0310 16:10:56.057195 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85834161-43ab-465b-bb71-811ed69c132b" containerName="dnsmasq-dns" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.057203 
4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85834161-43ab-465b-bb71-811ed69c132b" containerName="dnsmasq-dns" Mar 10 16:10:56 crc kubenswrapper[4749]: E0310 16:10:56.057224 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1424b4e-9b0e-4108-81e0-6adcd7ec34cc" containerName="nova-manage" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.057232 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1424b4e-9b0e-4108-81e0-6adcd7ec34cc" containerName="nova-manage" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.061357 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a738c4f5-0130-4bf4-85a4-ebbf9188ac51" containerName="nova-cell1-conductor-db-sync" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.061409 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1424b4e-9b0e-4108-81e0-6adcd7ec34cc" containerName="nova-manage" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.061427 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85834161-43ab-465b-bb71-811ed69c132b" containerName="dnsmasq-dns" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.062134 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.065261 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.072209 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.109591 4749 scope.go:117] "RemoveContainer" containerID="8914d90010a8494bbf6ad9346d0b25d6a194b8e682a59f9b058c540a1a071030" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.110992 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86dc97b969-c77bn"] Mar 10 16:10:56 crc kubenswrapper[4749]: E0310 16:10:56.116679 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8914d90010a8494bbf6ad9346d0b25d6a194b8e682a59f9b058c540a1a071030\": container with ID starting with 8914d90010a8494bbf6ad9346d0b25d6a194b8e682a59f9b058c540a1a071030 not found: ID does not exist" containerID="8914d90010a8494bbf6ad9346d0b25d6a194b8e682a59f9b058c540a1a071030" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.116725 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8914d90010a8494bbf6ad9346d0b25d6a194b8e682a59f9b058c540a1a071030"} err="failed to get container status \"8914d90010a8494bbf6ad9346d0b25d6a194b8e682a59f9b058c540a1a071030\": rpc error: code = NotFound desc = could not find container \"8914d90010a8494bbf6ad9346d0b25d6a194b8e682a59f9b058c540a1a071030\": container with ID starting with 8914d90010a8494bbf6ad9346d0b25d6a194b8e682a59f9b058c540a1a071030 not found: ID does not exist" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.116756 4749 scope.go:117] "RemoveContainer" containerID="892aaf5f6d48acc7761f831589c515f4a6698ac0529e229f1a6febfd5df4373f" Mar 10 16:10:56 crc 
kubenswrapper[4749]: E0310 16:10:56.117199 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"892aaf5f6d48acc7761f831589c515f4a6698ac0529e229f1a6febfd5df4373f\": container with ID starting with 892aaf5f6d48acc7761f831589c515f4a6698ac0529e229f1a6febfd5df4373f not found: ID does not exist" containerID="892aaf5f6d48acc7761f831589c515f4a6698ac0529e229f1a6febfd5df4373f" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.117226 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"892aaf5f6d48acc7761f831589c515f4a6698ac0529e229f1a6febfd5df4373f"} err="failed to get container status \"892aaf5f6d48acc7761f831589c515f4a6698ac0529e229f1a6febfd5df4373f\": rpc error: code = NotFound desc = could not find container \"892aaf5f6d48acc7761f831589c515f4a6698ac0529e229f1a6febfd5df4373f\": container with ID starting with 892aaf5f6d48acc7761f831589c515f4a6698ac0529e229f1a6febfd5df4373f not found: ID does not exist" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.124189 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86dc97b969-c77bn"] Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.209785 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz4j2\" (UniqueName: \"kubernetes.io/projected/e80985ef-0a5d-403a-b351-c59bd878723d-kube-api-access-fz4j2\") pod \"nova-cell1-conductor-0\" (UID: \"e80985ef-0a5d-403a-b351-c59bd878723d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.209926 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80985ef-0a5d-403a-b351-c59bd878723d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e80985ef-0a5d-403a-b351-c59bd878723d\") " 
pod="openstack/nova-cell1-conductor-0" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.210080 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80985ef-0a5d-403a-b351-c59bd878723d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e80985ef-0a5d-403a-b351-c59bd878723d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.250928 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.251156 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0c504bbd-1bcd-4d75-b879-1ec5e50116e2" containerName="nova-api-log" containerID="cri-o://b118664ea44bf03b6fa23d6923c2074c6ec65d4c3fc8ea7d3ab0e04eed49dc8b" gracePeriod=30 Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.251643 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0c504bbd-1bcd-4d75-b879-1ec5e50116e2" containerName="nova-api-api" containerID="cri-o://29d0dd8e218096a816a7f52245ea13ccf5579cfa2e6bf9f4ec0038717968de36" gracePeriod=30 Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.269844 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.311700 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz4j2\" (UniqueName: \"kubernetes.io/projected/e80985ef-0a5d-403a-b351-c59bd878723d-kube-api-access-fz4j2\") pod \"nova-cell1-conductor-0\" (UID: \"e80985ef-0a5d-403a-b351-c59bd878723d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.311764 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e80985ef-0a5d-403a-b351-c59bd878723d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e80985ef-0a5d-403a-b351-c59bd878723d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.311833 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80985ef-0a5d-403a-b351-c59bd878723d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e80985ef-0a5d-403a-b351-c59bd878723d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.318095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80985ef-0a5d-403a-b351-c59bd878723d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e80985ef-0a5d-403a-b351-c59bd878723d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.320972 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80985ef-0a5d-403a-b351-c59bd878723d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e80985ef-0a5d-403a-b351-c59bd878723d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.335251 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz4j2\" (UniqueName: \"kubernetes.io/projected/e80985ef-0a5d-403a-b351-c59bd878723d-kube-api-access-fz4j2\") pod \"nova-cell1-conductor-0\" (UID: \"e80985ef-0a5d-403a-b351-c59bd878723d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.451334 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.929898 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 16:10:56 crc kubenswrapper[4749]: W0310 16:10:56.930448 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode80985ef_0a5d_403a_b351_c59bd878723d.slice/crio-d35a092ca12c6db810580c404110b4910cbd51eb33e17d8f5d197bbe075f83d6 WatchSource:0}: Error finding container d35a092ca12c6db810580c404110b4910cbd51eb33e17d8f5d197bbe075f83d6: Status 404 returned error can't find the container with id d35a092ca12c6db810580c404110b4910cbd51eb33e17d8f5d197bbe075f83d6 Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.985281 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e80985ef-0a5d-403a-b351-c59bd878723d","Type":"ContainerStarted","Data":"d35a092ca12c6db810580c404110b4910cbd51eb33e17d8f5d197bbe075f83d6"} Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.988784 4749 generic.go:334] "Generic (PLEG): container finished" podID="0c504bbd-1bcd-4d75-b879-1ec5e50116e2" containerID="b118664ea44bf03b6fa23d6923c2074c6ec65d4c3fc8ea7d3ab0e04eed49dc8b" exitCode=143 Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.988992 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd" containerName="nova-scheduler-scheduler" containerID="cri-o://60e9adf056e23c215faeea578a02083d96813af621e53bf690d7b60f0e0a0479" gracePeriod=30 Mar 10 16:10:56 crc kubenswrapper[4749]: I0310 16:10:56.989282 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c504bbd-1bcd-4d75-b879-1ec5e50116e2","Type":"ContainerDied","Data":"b118664ea44bf03b6fa23d6923c2074c6ec65d4c3fc8ea7d3ab0e04eed49dc8b"} Mar 10 16:10:57 
crc kubenswrapper[4749]: I0310 16:10:57.622058 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85834161-43ab-465b-bb71-811ed69c132b" path="/var/lib/kubelet/pods/85834161-43ab-465b-bb71-811ed69c132b/volumes" Mar 10 16:10:57 crc kubenswrapper[4749]: I0310 16:10:57.998233 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e80985ef-0a5d-403a-b351-c59bd878723d","Type":"ContainerStarted","Data":"c94663d92d50885e5f9777c2186efd9f1a68ab7dca303b557f1ab7d4547ae21e"} Mar 10 16:10:57 crc kubenswrapper[4749]: I0310 16:10:57.998411 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 10 16:10:58 crc kubenswrapper[4749]: I0310 16:10:58.025330 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.025308544 podStartE2EDuration="2.025308544s" podCreationTimestamp="2026-03-10 16:10:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:10:58.013480438 +0000 UTC m=+1355.135346145" watchObservedRunningTime="2026-03-10 16:10:58.025308544 +0000 UTC m=+1355.147174241" Mar 10 16:10:59 crc kubenswrapper[4749]: E0310 16:10:59.426978 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60e9adf056e23c215faeea578a02083d96813af621e53bf690d7b60f0e0a0479" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 16:10:59 crc kubenswrapper[4749]: E0310 16:10:59.430564 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="60e9adf056e23c215faeea578a02083d96813af621e53bf690d7b60f0e0a0479" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 16:10:59 crc kubenswrapper[4749]: E0310 16:10:59.433650 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60e9adf056e23c215faeea578a02083d96813af621e53bf690d7b60f0e0a0479" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 16:10:59 crc kubenswrapper[4749]: E0310 16:10:59.433893 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd" containerName="nova-scheduler-scheduler" Mar 10 16:11:00 crc kubenswrapper[4749]: I0310 16:11:00.019633 4749 generic.go:334] "Generic (PLEG): container finished" podID="3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd" containerID="60e9adf056e23c215faeea578a02083d96813af621e53bf690d7b60f0e0a0479" exitCode=0 Mar 10 16:11:00 crc kubenswrapper[4749]: I0310 16:11:00.019689 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd","Type":"ContainerDied","Data":"60e9adf056e23c215faeea578a02083d96813af621e53bf690d7b60f0e0a0479"} Mar 10 16:11:00 crc kubenswrapper[4749]: I0310 16:11:00.331229 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 16:11:00 crc kubenswrapper[4749]: I0310 16:11:00.392327 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-combined-ca-bundle\") pod \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\" (UID: \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\") " Mar 10 16:11:00 crc kubenswrapper[4749]: I0310 16:11:00.393200 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-config-data\") pod \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\" (UID: \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\") " Mar 10 16:11:00 crc kubenswrapper[4749]: I0310 16:11:00.393624 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8zwj\" (UniqueName: \"kubernetes.io/projected/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-kube-api-access-b8zwj\") pod \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\" (UID: \"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd\") " Mar 10 16:11:00 crc kubenswrapper[4749]: I0310 16:11:00.398615 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-kube-api-access-b8zwj" (OuterVolumeSpecName: "kube-api-access-b8zwj") pod "3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd" (UID: "3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd"). InnerVolumeSpecName "kube-api-access-b8zwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:11:00 crc kubenswrapper[4749]: I0310 16:11:00.425633 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd" (UID: "3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:00 crc kubenswrapper[4749]: I0310 16:11:00.441135 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-config-data" (OuterVolumeSpecName: "config-data") pod "3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd" (UID: "3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:00 crc kubenswrapper[4749]: I0310 16:11:00.496004 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:00 crc kubenswrapper[4749]: I0310 16:11:00.496046 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8zwj\" (UniqueName: \"kubernetes.io/projected/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-kube-api-access-b8zwj\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:00 crc kubenswrapper[4749]: I0310 16:11:00.496062 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.042224 4749 generic.go:334] "Generic (PLEG): container finished" podID="0c504bbd-1bcd-4d75-b879-1ec5e50116e2" containerID="29d0dd8e218096a816a7f52245ea13ccf5579cfa2e6bf9f4ec0038717968de36" exitCode=0 Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.042784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c504bbd-1bcd-4d75-b879-1ec5e50116e2","Type":"ContainerDied","Data":"29d0dd8e218096a816a7f52245ea13ccf5579cfa2e6bf9f4ec0038717968de36"} Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.045607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd","Type":"ContainerDied","Data":"59ef5747157c8338459e8004009f9e330405f7e2e7fbb846f111e3e680d2f4ee"} Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.046450 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.046720 4749 scope.go:117] "RemoveContainer" containerID="60e9adf056e23c215faeea578a02083d96813af621e53bf690d7b60f0e0a0479" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.157570 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.170570 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.179095 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.209083 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f9b2\" (UniqueName: \"kubernetes.io/projected/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-kube-api-access-8f9b2\") pod \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.209159 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-logs\") pod \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.209203 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-combined-ca-bundle\") pod \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\" (UID: 
\"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.209228 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-config-data\") pod \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\" (UID: \"0c504bbd-1bcd-4d75-b879-1ec5e50116e2\") " Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.210503 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-logs" (OuterVolumeSpecName: "logs") pod "0c504bbd-1bcd-4d75-b879-1ec5e50116e2" (UID: "0c504bbd-1bcd-4d75-b879-1ec5e50116e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.218478 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:11:01 crc kubenswrapper[4749]: E0310 16:11:01.218941 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c504bbd-1bcd-4d75-b879-1ec5e50116e2" containerName="nova-api-api" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.218963 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c504bbd-1bcd-4d75-b879-1ec5e50116e2" containerName="nova-api-api" Mar 10 16:11:01 crc kubenswrapper[4749]: E0310 16:11:01.218989 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd" containerName="nova-scheduler-scheduler" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.218998 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd" containerName="nova-scheduler-scheduler" Mar 10 16:11:01 crc kubenswrapper[4749]: E0310 16:11:01.219018 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c504bbd-1bcd-4d75-b879-1ec5e50116e2" containerName="nova-api-log" Mar 10 16:11:01 crc kubenswrapper[4749]: 
I0310 16:11:01.219026 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c504bbd-1bcd-4d75-b879-1ec5e50116e2" containerName="nova-api-log" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.219263 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c504bbd-1bcd-4d75-b879-1ec5e50116e2" containerName="nova-api-log" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.219280 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd" containerName="nova-scheduler-scheduler" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.219293 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c504bbd-1bcd-4d75-b879-1ec5e50116e2" containerName="nova-api-api" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.220017 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.229990 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.240221 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-kube-api-access-8f9b2" (OuterVolumeSpecName: "kube-api-access-8f9b2") pod "0c504bbd-1bcd-4d75-b879-1ec5e50116e2" (UID: "0c504bbd-1bcd-4d75-b879-1ec5e50116e2"). InnerVolumeSpecName "kube-api-access-8f9b2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.246910 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.250069 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-config-data" (OuterVolumeSpecName: "config-data") pod "0c504bbd-1bcd-4d75-b879-1ec5e50116e2" (UID: "0c504bbd-1bcd-4d75-b879-1ec5e50116e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.252925 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c504bbd-1bcd-4d75-b879-1ec5e50116e2" (UID: "0c504bbd-1bcd-4d75-b879-1ec5e50116e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.311549 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ba94ea-09bc-4752-b93f-edf9ede8b871-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"45ba94ea-09bc-4752-b93f-edf9ede8b871\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.311738 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm7xv\" (UniqueName: \"kubernetes.io/projected/45ba94ea-09bc-4752-b93f-edf9ede8b871-kube-api-access-qm7xv\") pod \"nova-scheduler-0\" (UID: \"45ba94ea-09bc-4752-b93f-edf9ede8b871\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.311829 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45ba94ea-09bc-4752-b93f-edf9ede8b871-config-data\") pod \"nova-scheduler-0\" (UID: \"45ba94ea-09bc-4752-b93f-edf9ede8b871\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.311939 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f9b2\" (UniqueName: \"kubernetes.io/projected/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-kube-api-access-8f9b2\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.311962 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.311980 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.311998 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c504bbd-1bcd-4d75-b879-1ec5e50116e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.414037 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm7xv\" (UniqueName: \"kubernetes.io/projected/45ba94ea-09bc-4752-b93f-edf9ede8b871-kube-api-access-qm7xv\") pod \"nova-scheduler-0\" (UID: \"45ba94ea-09bc-4752-b93f-edf9ede8b871\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.414135 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45ba94ea-09bc-4752-b93f-edf9ede8b871-config-data\") pod \"nova-scheduler-0\" (UID: \"45ba94ea-09bc-4752-b93f-edf9ede8b871\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.414238 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ba94ea-09bc-4752-b93f-edf9ede8b871-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"45ba94ea-09bc-4752-b93f-edf9ede8b871\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.418127 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45ba94ea-09bc-4752-b93f-edf9ede8b871-config-data\") pod \"nova-scheduler-0\" (UID: \"45ba94ea-09bc-4752-b93f-edf9ede8b871\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.418328 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ba94ea-09bc-4752-b93f-edf9ede8b871-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"45ba94ea-09bc-4752-b93f-edf9ede8b871\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.431684 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm7xv\" (UniqueName: \"kubernetes.io/projected/45ba94ea-09bc-4752-b93f-edf9ede8b871-kube-api-access-qm7xv\") pod \"nova-scheduler-0\" (UID: \"45ba94ea-09bc-4752-b93f-edf9ede8b871\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.618071 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 16:11:01 crc kubenswrapper[4749]: I0310 16:11:01.623612 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd" path="/var/lib/kubelet/pods/3e8f2dad-b0d6-4c12-a36f-f3bdb82e08cd/volumes" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.056680 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c504bbd-1bcd-4d75-b879-1ec5e50116e2","Type":"ContainerDied","Data":"b0a265455d1fa39f5e707f7c742e174818283d991222aa3f718aa741aadf4f87"} Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.057274 4749 scope.go:117] "RemoveContainer" containerID="29d0dd8e218096a816a7f52245ea13ccf5579cfa2e6bf9f4ec0038717968de36" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.056897 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.093679 4749 scope.go:117] "RemoveContainer" containerID="b118664ea44bf03b6fa23d6923c2074c6ec65d4c3fc8ea7d3ab0e04eed49dc8b" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.106360 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.123031 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.147718 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.149767 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.153082 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.161990 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.174878 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.240448 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-logs\") pod \"nova-api-0\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " pod="openstack/nova-api-0" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.240529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " 
pod="openstack/nova-api-0" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.240793 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdh6l\" (UniqueName: \"kubernetes.io/projected/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-kube-api-access-zdh6l\") pod \"nova-api-0\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " pod="openstack/nova-api-0" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.240869 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-config-data\") pod \"nova-api-0\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " pod="openstack/nova-api-0" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.343117 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdh6l\" (UniqueName: \"kubernetes.io/projected/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-kube-api-access-zdh6l\") pod \"nova-api-0\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " pod="openstack/nova-api-0" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.343179 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-config-data\") pod \"nova-api-0\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " pod="openstack/nova-api-0" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.343242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-logs\") pod \"nova-api-0\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " pod="openstack/nova-api-0" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.343266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " pod="openstack/nova-api-0" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.343832 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-logs\") pod \"nova-api-0\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " pod="openstack/nova-api-0" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.348320 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-config-data\") pod \"nova-api-0\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " pod="openstack/nova-api-0" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.356241 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " pod="openstack/nova-api-0" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.364646 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdh6l\" (UniqueName: \"kubernetes.io/projected/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-kube-api-access-zdh6l\") pod \"nova-api-0\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " pod="openstack/nova-api-0" Mar 10 16:11:02 crc kubenswrapper[4749]: I0310 16:11:02.583463 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:11:03 crc kubenswrapper[4749]: I0310 16:11:03.073624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"45ba94ea-09bc-4752-b93f-edf9ede8b871","Type":"ContainerStarted","Data":"fbd038681ab1812c238db796f0ec51f2c716ef880f7cdfee0cb69680be7a5c66"} Mar 10 16:11:03 crc kubenswrapper[4749]: I0310 16:11:03.074166 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"45ba94ea-09bc-4752-b93f-edf9ede8b871","Type":"ContainerStarted","Data":"a24acf31768b0d11ee0a66f2ba6fc04e5be01cce73484aad86cb144477e538dd"} Mar 10 16:11:03 crc kubenswrapper[4749]: I0310 16:11:03.078504 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:03 crc kubenswrapper[4749]: I0310 16:11:03.624807 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c504bbd-1bcd-4d75-b879-1ec5e50116e2" path="/var/lib/kubelet/pods/0c504bbd-1bcd-4d75-b879-1ec5e50116e2/volumes" Mar 10 16:11:03 crc kubenswrapper[4749]: I0310 16:11:03.638995 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.638976213 podStartE2EDuration="2.638976213s" podCreationTimestamp="2026-03-10 16:11:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:11:03.102136386 +0000 UTC m=+1360.224002083" watchObservedRunningTime="2026-03-10 16:11:03.638976213 +0000 UTC m=+1360.760841900" Mar 10 16:11:04 crc kubenswrapper[4749]: I0310 16:11:04.101993 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933","Type":"ContainerStarted","Data":"1ddf598384d462b86a96131544a59f47b34d385fc5faec6973c307b30a154fe3"} Mar 10 16:11:04 crc kubenswrapper[4749]: I0310 16:11:04.102478 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933","Type":"ContainerStarted","Data":"6c14ddd3cc8d984f0eb0e7a785cc1efc9be9917719ee00504c76feb0015858d4"} Mar 10 16:11:04 crc kubenswrapper[4749]: I0310 16:11:04.102496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933","Type":"ContainerStarted","Data":"a6fdd3666d75a4faeb60443aafea130153f75d48a6a5d6bf60c64eff934fd3b9"} Mar 10 16:11:04 crc kubenswrapper[4749]: I0310 16:11:04.127745 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.127719792 podStartE2EDuration="2.127719792s" podCreationTimestamp="2026-03-10 16:11:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:11:04.120011859 +0000 UTC m=+1361.241877556" watchObservedRunningTime="2026-03-10 16:11:04.127719792 +0000 UTC m=+1361.249585489" Mar 10 16:11:06 crc kubenswrapper[4749]: I0310 16:11:06.493159 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 10 16:11:06 crc kubenswrapper[4749]: I0310 16:11:06.618910 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 16:11:11 crc kubenswrapper[4749]: I0310 16:11:11.172198 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 16:11:11 crc kubenswrapper[4749]: I0310 16:11:11.635448 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 16:11:11 crc kubenswrapper[4749]: I0310 16:11:11.666309 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 16:11:12 crc kubenswrapper[4749]: I0310 16:11:12.215931 4749 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 16:11:12 crc kubenswrapper[4749]: I0310 16:11:12.583696 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 16:11:12 crc kubenswrapper[4749]: I0310 16:11:12.583752 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 16:11:13 crc kubenswrapper[4749]: I0310 16:11:13.666633 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 16:11:13 crc kubenswrapper[4749]: I0310 16:11:13.666669 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.211615 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.255479 4749 generic.go:334] "Generic (PLEG): container finished" podID="11b31489-22e6-4403-a5f3-9375a8ac4fef" containerID="98312c49efd18bb55edfdeec5639eeac9104f6971091f342401d0ecddce25c0b" exitCode=137 Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.255588 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"11b31489-22e6-4403-a5f3-9375a8ac4fef","Type":"ContainerDied","Data":"98312c49efd18bb55edfdeec5639eeac9104f6971091f342401d0ecddce25c0b"} Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.257084 4749 generic.go:334] "Generic (PLEG): container finished" podID="5b99cb14-8007-41ca-9c15-4af7fad17cf1" containerID="ac776a91b99c2197d172d9af7762ff9be871c1e6e29495dc0e8af2bd81ac9a0e" exitCode=137 Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.257112 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b99cb14-8007-41ca-9c15-4af7fad17cf1","Type":"ContainerDied","Data":"ac776a91b99c2197d172d9af7762ff9be871c1e6e29495dc0e8af2bd81ac9a0e"} Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.257137 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b99cb14-8007-41ca-9c15-4af7fad17cf1","Type":"ContainerDied","Data":"e58f10ca18703a4086b0e2da4e55f08465fd631f28a20f279b82ebb428d8bafa"} Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.257161 4749 scope.go:117] "RemoveContainer" containerID="ac776a91b99c2197d172d9af7762ff9be871c1e6e29495dc0e8af2bd81ac9a0e" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.257389 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.281798 4749 scope.go:117] "RemoveContainer" containerID="b369383240a273cbafd6ce35533215726417c106a005d8eec338353a886b07af" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.296850 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b99cb14-8007-41ca-9c15-4af7fad17cf1-config-data\") pod \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.296904 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b99cb14-8007-41ca-9c15-4af7fad17cf1-combined-ca-bundle\") pod \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.297053 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b99cb14-8007-41ca-9c15-4af7fad17cf1-logs\") pod \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.297132 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvt56\" (UniqueName: \"kubernetes.io/projected/5b99cb14-8007-41ca-9c15-4af7fad17cf1-kube-api-access-fvt56\") pod \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\" (UID: \"5b99cb14-8007-41ca-9c15-4af7fad17cf1\") " Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.297924 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b99cb14-8007-41ca-9c15-4af7fad17cf1-logs" (OuterVolumeSpecName: "logs") pod "5b99cb14-8007-41ca-9c15-4af7fad17cf1" (UID: "5b99cb14-8007-41ca-9c15-4af7fad17cf1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.303561 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b99cb14-8007-41ca-9c15-4af7fad17cf1-kube-api-access-fvt56" (OuterVolumeSpecName: "kube-api-access-fvt56") pod "5b99cb14-8007-41ca-9c15-4af7fad17cf1" (UID: "5b99cb14-8007-41ca-9c15-4af7fad17cf1"). InnerVolumeSpecName "kube-api-access-fvt56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.328365 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b99cb14-8007-41ca-9c15-4af7fad17cf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b99cb14-8007-41ca-9c15-4af7fad17cf1" (UID: "5b99cb14-8007-41ca-9c15-4af7fad17cf1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.330938 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b99cb14-8007-41ca-9c15-4af7fad17cf1-config-data" (OuterVolumeSpecName: "config-data") pod "5b99cb14-8007-41ca-9c15-4af7fad17cf1" (UID: "5b99cb14-8007-41ca-9c15-4af7fad17cf1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.350258 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.361742 4749 scope.go:117] "RemoveContainer" containerID="ac776a91b99c2197d172d9af7762ff9be871c1e6e29495dc0e8af2bd81ac9a0e" Mar 10 16:11:19 crc kubenswrapper[4749]: E0310 16:11:19.362413 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac776a91b99c2197d172d9af7762ff9be871c1e6e29495dc0e8af2bd81ac9a0e\": container with ID starting with ac776a91b99c2197d172d9af7762ff9be871c1e6e29495dc0e8af2bd81ac9a0e not found: ID does not exist" containerID="ac776a91b99c2197d172d9af7762ff9be871c1e6e29495dc0e8af2bd81ac9a0e" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.362460 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac776a91b99c2197d172d9af7762ff9be871c1e6e29495dc0e8af2bd81ac9a0e"} err="failed to get container status \"ac776a91b99c2197d172d9af7762ff9be871c1e6e29495dc0e8af2bd81ac9a0e\": rpc error: code = NotFound desc = could not find container \"ac776a91b99c2197d172d9af7762ff9be871c1e6e29495dc0e8af2bd81ac9a0e\": container with ID starting with ac776a91b99c2197d172d9af7762ff9be871c1e6e29495dc0e8af2bd81ac9a0e not found: ID does not exist" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.362486 4749 scope.go:117] "RemoveContainer" containerID="b369383240a273cbafd6ce35533215726417c106a005d8eec338353a886b07af" Mar 10 16:11:19 crc kubenswrapper[4749]: E0310 16:11:19.362852 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b369383240a273cbafd6ce35533215726417c106a005d8eec338353a886b07af\": container with ID starting with b369383240a273cbafd6ce35533215726417c106a005d8eec338353a886b07af not found: ID does not exist" containerID="b369383240a273cbafd6ce35533215726417c106a005d8eec338353a886b07af" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 
16:11:19.362917 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b369383240a273cbafd6ce35533215726417c106a005d8eec338353a886b07af"} err="failed to get container status \"b369383240a273cbafd6ce35533215726417c106a005d8eec338353a886b07af\": rpc error: code = NotFound desc = could not find container \"b369383240a273cbafd6ce35533215726417c106a005d8eec338353a886b07af\": container with ID starting with b369383240a273cbafd6ce35533215726417c106a005d8eec338353a886b07af not found: ID does not exist" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.399456 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b99cb14-8007-41ca-9c15-4af7fad17cf1-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.399499 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b99cb14-8007-41ca-9c15-4af7fad17cf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.399516 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b99cb14-8007-41ca-9c15-4af7fad17cf1-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.399527 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvt56\" (UniqueName: \"kubernetes.io/projected/5b99cb14-8007-41ca-9c15-4af7fad17cf1-kube-api-access-fvt56\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.500350 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzj5h\" (UniqueName: \"kubernetes.io/projected/11b31489-22e6-4403-a5f3-9375a8ac4fef-kube-api-access-mzj5h\") pod \"11b31489-22e6-4403-a5f3-9375a8ac4fef\" (UID: \"11b31489-22e6-4403-a5f3-9375a8ac4fef\") " Mar 10 16:11:19 crc 
kubenswrapper[4749]: I0310 16:11:19.500592 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11b31489-22e6-4403-a5f3-9375a8ac4fef-combined-ca-bundle\") pod \"11b31489-22e6-4403-a5f3-9375a8ac4fef\" (UID: \"11b31489-22e6-4403-a5f3-9375a8ac4fef\") " Mar 10 16:11:19 crc kubenswrapper[4749]: I0310 16:11:19.500649 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11b31489-22e6-4403-a5f3-9375a8ac4fef-config-data\") pod \"11b31489-22e6-4403-a5f3-9375a8ac4fef\" (UID: \"11b31489-22e6-4403-a5f3-9375a8ac4fef\") " Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.503611 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11b31489-22e6-4403-a5f3-9375a8ac4fef-kube-api-access-mzj5h" (OuterVolumeSpecName: "kube-api-access-mzj5h") pod "11b31489-22e6-4403-a5f3-9375a8ac4fef" (UID: "11b31489-22e6-4403-a5f3-9375a8ac4fef"). InnerVolumeSpecName "kube-api-access-mzj5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.532146 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11b31489-22e6-4403-a5f3-9375a8ac4fef-config-data" (OuterVolumeSpecName: "config-data") pod "11b31489-22e6-4403-a5f3-9375a8ac4fef" (UID: "11b31489-22e6-4403-a5f3-9375a8ac4fef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.540793 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11b31489-22e6-4403-a5f3-9375a8ac4fef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11b31489-22e6-4403-a5f3-9375a8ac4fef" (UID: "11b31489-22e6-4403-a5f3-9375a8ac4fef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.602656 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11b31489-22e6-4403-a5f3-9375a8ac4fef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.602683 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11b31489-22e6-4403-a5f3-9375a8ac4fef-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.602692 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzj5h\" (UniqueName: \"kubernetes.io/projected/11b31489-22e6-4403-a5f3-9375a8ac4fef-kube-api-access-mzj5h\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.630548 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.636839 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.651681 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:11:20 crc kubenswrapper[4749]: E0310 16:11:19.652234 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b31489-22e6-4403-a5f3-9375a8ac4fef" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.652253 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b31489-22e6-4403-a5f3-9375a8ac4fef" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 16:11:20 crc kubenswrapper[4749]: E0310 16:11:19.652275 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b99cb14-8007-41ca-9c15-4af7fad17cf1" containerName="nova-metadata-metadata" Mar 10 16:11:20 crc 
kubenswrapper[4749]: I0310 16:11:19.652284 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b99cb14-8007-41ca-9c15-4af7fad17cf1" containerName="nova-metadata-metadata" Mar 10 16:11:20 crc kubenswrapper[4749]: E0310 16:11:19.652318 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b99cb14-8007-41ca-9c15-4af7fad17cf1" containerName="nova-metadata-log" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.652328 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b99cb14-8007-41ca-9c15-4af7fad17cf1" containerName="nova-metadata-log" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.652922 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b31489-22e6-4403-a5f3-9375a8ac4fef" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.652948 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b99cb14-8007-41ca-9c15-4af7fad17cf1" containerName="nova-metadata-metadata" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.652961 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b99cb14-8007-41ca-9c15-4af7fad17cf1" containerName="nova-metadata-log" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.654208 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.656832 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.660640 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.667693 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.806725 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b57b1973-d9a3-4695-b38f-a15bdf5ec778-logs\") pod \"nova-metadata-0\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.806827 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.806861 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-config-data\") pod \"nova-metadata-0\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.806894 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxzgf\" (UniqueName: \"kubernetes.io/projected/b57b1973-d9a3-4695-b38f-a15bdf5ec778-kube-api-access-gxzgf\") pod \"nova-metadata-0\" (UID: 
\"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.807030 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.909594 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-config-data\") pod \"nova-metadata-0\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.909700 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxzgf\" (UniqueName: \"kubernetes.io/projected/b57b1973-d9a3-4695-b38f-a15bdf5ec778-kube-api-access-gxzgf\") pod \"nova-metadata-0\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.909780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.909936 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b57b1973-d9a3-4695-b38f-a15bdf5ec778-logs\") pod \"nova-metadata-0\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.910088 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.910356 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b57b1973-d9a3-4695-b38f-a15bdf5ec778-logs\") pod \"nova-metadata-0\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.913506 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.918908 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.919107 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-config-data\") pod \"nova-metadata-0\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.929684 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxzgf\" (UniqueName: \"kubernetes.io/projected/b57b1973-d9a3-4695-b38f-a15bdf5ec778-kube-api-access-gxzgf\") pod \"nova-metadata-0\" 
(UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:19.975073 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.267965 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:11:20 crc kubenswrapper[4749]: W0310 16:11:20.268680 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb57b1973_d9a3_4695_b38f_a15bdf5ec778.slice/crio-9f74f2d0070202c1205e8fa5a2911515a283a292010a675ba09195efb8f43bb7 WatchSource:0}: Error finding container 9f74f2d0070202c1205e8fa5a2911515a283a292010a675ba09195efb8f43bb7: Status 404 returned error can't find the container with id 9f74f2d0070202c1205e8fa5a2911515a283a292010a675ba09195efb8f43bb7 Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.269062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"11b31489-22e6-4403-a5f3-9375a8ac4fef","Type":"ContainerDied","Data":"226f3b33104ddf6e6f97b0e07dd61c3cf411de023055d10b4e47cb1f6e36a9ae"} Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.269103 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.269108 4749 scope.go:117] "RemoveContainer" containerID="98312c49efd18bb55edfdeec5639eeac9104f6971091f342401d0ecddce25c0b" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.330989 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.342498 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.354527 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.357415 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.361322 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.361833 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.361993 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.369132 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.420397 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgrfd\" (UniqueName: \"kubernetes.io/projected/b5ba9db0-29a2-468a-ab78-871620e30790-kube-api-access-lgrfd\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.420543 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.420576 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.420711 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.420755 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.524287 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgrfd\" (UniqueName: \"kubernetes.io/projected/b5ba9db0-29a2-468a-ab78-871620e30790-kube-api-access-lgrfd\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"b5ba9db0-29a2-468a-ab78-871620e30790\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.524589 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.524692 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.524861 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.524929 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.528365 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " pod="openstack/nova-cell1-novncproxy-0" Mar 
10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.528713 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.529912 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.530685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.541195 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgrfd\" (UniqueName: \"kubernetes.io/projected/b5ba9db0-29a2-468a-ab78-871620e30790-kube-api-access-lgrfd\") pod \"nova-cell1-novncproxy-0\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:20 crc kubenswrapper[4749]: I0310 16:11:20.685919 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:21 crc kubenswrapper[4749]: I0310 16:11:21.192250 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 16:11:21 crc kubenswrapper[4749]: W0310 16:11:21.195853 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5ba9db0_29a2_468a_ab78_871620e30790.slice/crio-33406fead6089a5d9460b37d43d2b0208cbf151ffbc0eab27176c0dee2182fba WatchSource:0}: Error finding container 33406fead6089a5d9460b37d43d2b0208cbf151ffbc0eab27176c0dee2182fba: Status 404 returned error can't find the container with id 33406fead6089a5d9460b37d43d2b0208cbf151ffbc0eab27176c0dee2182fba Mar 10 16:11:21 crc kubenswrapper[4749]: I0310 16:11:21.279360 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b5ba9db0-29a2-468a-ab78-871620e30790","Type":"ContainerStarted","Data":"33406fead6089a5d9460b37d43d2b0208cbf151ffbc0eab27176c0dee2182fba"} Mar 10 16:11:21 crc kubenswrapper[4749]: I0310 16:11:21.281663 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b57b1973-d9a3-4695-b38f-a15bdf5ec778","Type":"ContainerStarted","Data":"31bfbc9370df77520454e57e145a24575e3cb20e646035c0442c2954c9a82d00"} Mar 10 16:11:21 crc kubenswrapper[4749]: I0310 16:11:21.282280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b57b1973-d9a3-4695-b38f-a15bdf5ec778","Type":"ContainerStarted","Data":"c8bc78a6602473bafbbcb6def7b8eb27054b7820dd75e58a86c507d9d9fd7900"} Mar 10 16:11:21 crc kubenswrapper[4749]: I0310 16:11:21.282363 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b57b1973-d9a3-4695-b38f-a15bdf5ec778","Type":"ContainerStarted","Data":"9f74f2d0070202c1205e8fa5a2911515a283a292010a675ba09195efb8f43bb7"} Mar 10 16:11:21 
crc kubenswrapper[4749]: I0310 16:11:21.306921 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.306903506 podStartE2EDuration="2.306903506s" podCreationTimestamp="2026-03-10 16:11:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:11:21.300059998 +0000 UTC m=+1378.421925705" watchObservedRunningTime="2026-03-10 16:11:21.306903506 +0000 UTC m=+1378.428769193" Mar 10 16:11:21 crc kubenswrapper[4749]: I0310 16:11:21.620447 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11b31489-22e6-4403-a5f3-9375a8ac4fef" path="/var/lib/kubelet/pods/11b31489-22e6-4403-a5f3-9375a8ac4fef/volumes" Mar 10 16:11:21 crc kubenswrapper[4749]: I0310 16:11:21.621973 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b99cb14-8007-41ca-9c15-4af7fad17cf1" path="/var/lib/kubelet/pods/5b99cb14-8007-41ca-9c15-4af7fad17cf1/volumes" Mar 10 16:11:22 crc kubenswrapper[4749]: I0310 16:11:22.294778 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b5ba9db0-29a2-468a-ab78-871620e30790","Type":"ContainerStarted","Data":"f431f76071d5cc54434dee7be2f8613b19cde266ddb731b3a3189507c34c42ae"} Mar 10 16:11:22 crc kubenswrapper[4749]: I0310 16:11:22.320144 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.320122391 podStartE2EDuration="2.320122391s" podCreationTimestamp="2026-03-10 16:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:11:22.313143428 +0000 UTC m=+1379.435009125" watchObservedRunningTime="2026-03-10 16:11:22.320122391 +0000 UTC m=+1379.441988078" Mar 10 16:11:22 crc kubenswrapper[4749]: I0310 16:11:22.590736 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 16:11:22 crc kubenswrapper[4749]: I0310 16:11:22.591763 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 16:11:22 crc kubenswrapper[4749]: I0310 16:11:22.592953 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 16:11:22 crc kubenswrapper[4749]: I0310 16:11:22.594542 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.307718 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.314036 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.524189 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57db588689-ff8h6"] Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.526687 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.544015 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57db588689-ff8h6"] Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.603732 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-ovsdbserver-nb\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.603815 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj5cf\" (UniqueName: \"kubernetes.io/projected/dafd71a4-7276-4bce-84d9-6568e9d38d9d-kube-api-access-bj5cf\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.603860 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-config\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.603877 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-dns-svc\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.603911 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-ovsdbserver-sb\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.603950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-dns-swift-storage-0\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.706857 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-config\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.706905 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-dns-svc\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.706950 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-ovsdbserver-sb\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.707008 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-dns-swift-storage-0\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.707070 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-ovsdbserver-nb\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.707148 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj5cf\" (UniqueName: \"kubernetes.io/projected/dafd71a4-7276-4bce-84d9-6568e9d38d9d-kube-api-access-bj5cf\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.708020 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-config\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.708049 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-dns-svc\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.708034 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-ovsdbserver-sb\") pod 
\"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.708107 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-ovsdbserver-nb\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.708398 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-dns-swift-storage-0\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.728478 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj5cf\" (UniqueName: \"kubernetes.io/projected/dafd71a4-7276-4bce-84d9-6568e9d38d9d-kube-api-access-bj5cf\") pod \"dnsmasq-dns-57db588689-ff8h6\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:23 crc kubenswrapper[4749]: I0310 16:11:23.855939 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:24 crc kubenswrapper[4749]: I0310 16:11:24.345115 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57db588689-ff8h6"] Mar 10 16:11:24 crc kubenswrapper[4749]: W0310 16:11:24.353929 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddafd71a4_7276_4bce_84d9_6568e9d38d9d.slice/crio-492f5268c4141b5258a1ec3b21588ede536c407a07dc073e190797c9ab0052f7 WatchSource:0}: Error finding container 492f5268c4141b5258a1ec3b21588ede536c407a07dc073e190797c9ab0052f7: Status 404 returned error can't find the container with id 492f5268c4141b5258a1ec3b21588ede536c407a07dc073e190797c9ab0052f7 Mar 10 16:11:24 crc kubenswrapper[4749]: I0310 16:11:24.978573 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 16:11:24 crc kubenswrapper[4749]: I0310 16:11:24.979361 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 16:11:25 crc kubenswrapper[4749]: I0310 16:11:25.328578 4749 generic.go:334] "Generic (PLEG): container finished" podID="dafd71a4-7276-4bce-84d9-6568e9d38d9d" containerID="fec2e3f9d6052f4d4ff97e50b449c59e790b93ff8700dd853168b392f78e6839" exitCode=0 Mar 10 16:11:25 crc kubenswrapper[4749]: I0310 16:11:25.328866 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db588689-ff8h6" event={"ID":"dafd71a4-7276-4bce-84d9-6568e9d38d9d","Type":"ContainerDied","Data":"fec2e3f9d6052f4d4ff97e50b449c59e790b93ff8700dd853168b392f78e6839"} Mar 10 16:11:25 crc kubenswrapper[4749]: I0310 16:11:25.328921 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db588689-ff8h6" event={"ID":"dafd71a4-7276-4bce-84d9-6568e9d38d9d","Type":"ContainerStarted","Data":"492f5268c4141b5258a1ec3b21588ede536c407a07dc073e190797c9ab0052f7"} Mar 10 
16:11:25 crc kubenswrapper[4749]: I0310 16:11:25.618064 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:11:25 crc kubenswrapper[4749]: I0310 16:11:25.618772 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="ceilometer-central-agent" containerID="cri-o://b8f772b2003ce768d63b8764a45b2588ebad053a7ac134c0e4169b45b3ad18f7" gracePeriod=30 Mar 10 16:11:25 crc kubenswrapper[4749]: I0310 16:11:25.619055 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="proxy-httpd" containerID="cri-o://97ffe531c091ed348b7afa06530a7a2f977556665000110bd3b00771cb96df2f" gracePeriod=30 Mar 10 16:11:25 crc kubenswrapper[4749]: I0310 16:11:25.619098 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="sg-core" containerID="cri-o://481f4c3c3298a19c8b7aa10dee7982733390754a42eafc99e12365813a3876f9" gracePeriod=30 Mar 10 16:11:25 crc kubenswrapper[4749]: I0310 16:11:25.619129 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="ceilometer-notification-agent" containerID="cri-o://272cada3ceb61fc24759f133c47534f4b5a8230ffd3760fc625927da01eec0f8" gracePeriod=30 Mar 10 16:11:25 crc kubenswrapper[4749]: I0310 16:11:25.686499 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:26 crc kubenswrapper[4749]: I0310 16:11:26.078350 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:26 crc kubenswrapper[4749]: I0310 16:11:26.338975 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-57db588689-ff8h6" event={"ID":"dafd71a4-7276-4bce-84d9-6568e9d38d9d","Type":"ContainerStarted","Data":"c5c3ca0f09ffdc3b0ca768bffd9853e585b15e8e8d35502eee9fe5cdf2e81621"} Mar 10 16:11:26 crc kubenswrapper[4749]: I0310 16:11:26.340214 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:26 crc kubenswrapper[4749]: I0310 16:11:26.352923 4749 generic.go:334] "Generic (PLEG): container finished" podID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerID="97ffe531c091ed348b7afa06530a7a2f977556665000110bd3b00771cb96df2f" exitCode=0 Mar 10 16:11:26 crc kubenswrapper[4749]: I0310 16:11:26.352951 4749 generic.go:334] "Generic (PLEG): container finished" podID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerID="481f4c3c3298a19c8b7aa10dee7982733390754a42eafc99e12365813a3876f9" exitCode=2 Mar 10 16:11:26 crc kubenswrapper[4749]: I0310 16:11:26.352958 4749 generic.go:334] "Generic (PLEG): container finished" podID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerID="b8f772b2003ce768d63b8764a45b2588ebad053a7ac134c0e4169b45b3ad18f7" exitCode=0 Mar 10 16:11:26 crc kubenswrapper[4749]: I0310 16:11:26.353093 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" containerName="nova-api-log" containerID="cri-o://6c14ddd3cc8d984f0eb0e7a785cc1efc9be9917719ee00504c76feb0015858d4" gracePeriod=30 Mar 10 16:11:26 crc kubenswrapper[4749]: I0310 16:11:26.353181 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356bdd1f-5efb-4678-bb76-39d4720e16ba","Type":"ContainerDied","Data":"97ffe531c091ed348b7afa06530a7a2f977556665000110bd3b00771cb96df2f"} Mar 10 16:11:26 crc kubenswrapper[4749]: I0310 16:11:26.353215 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"356bdd1f-5efb-4678-bb76-39d4720e16ba","Type":"ContainerDied","Data":"481f4c3c3298a19c8b7aa10dee7982733390754a42eafc99e12365813a3876f9"} Mar 10 16:11:26 crc kubenswrapper[4749]: I0310 16:11:26.353229 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356bdd1f-5efb-4678-bb76-39d4720e16ba","Type":"ContainerDied","Data":"b8f772b2003ce768d63b8764a45b2588ebad053a7ac134c0e4169b45b3ad18f7"} Mar 10 16:11:26 crc kubenswrapper[4749]: I0310 16:11:26.353288 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" containerName="nova-api-api" containerID="cri-o://1ddf598384d462b86a96131544a59f47b34d385fc5faec6973c307b30a154fe3" gracePeriod=30 Mar 10 16:11:26 crc kubenswrapper[4749]: I0310 16:11:26.374367 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57db588689-ff8h6" podStartSLOduration=3.374343248 podStartE2EDuration="3.374343248s" podCreationTimestamp="2026-03-10 16:11:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:11:26.368521217 +0000 UTC m=+1383.490386924" watchObservedRunningTime="2026-03-10 16:11:26.374343248 +0000 UTC m=+1383.496208935" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.334504 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.364258 4749 generic.go:334] "Generic (PLEG): container finished" podID="9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" containerID="6c14ddd3cc8d984f0eb0e7a785cc1efc9be9917719ee00504c76feb0015858d4" exitCode=143 Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.364350 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933","Type":"ContainerDied","Data":"6c14ddd3cc8d984f0eb0e7a785cc1efc9be9917719ee00504c76feb0015858d4"} Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.367494 4749 generic.go:334] "Generic (PLEG): container finished" podID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerID="272cada3ceb61fc24759f133c47534f4b5a8230ffd3760fc625927da01eec0f8" exitCode=0 Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.368645 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.369213 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356bdd1f-5efb-4678-bb76-39d4720e16ba","Type":"ContainerDied","Data":"272cada3ceb61fc24759f133c47534f4b5a8230ffd3760fc625927da01eec0f8"} Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.369248 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356bdd1f-5efb-4678-bb76-39d4720e16ba","Type":"ContainerDied","Data":"49b6c34fd018d5b300c83514935a6ebd534395adf134b7766c7f77b3f3dbc3b6"} Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.369268 4749 scope.go:117] "RemoveContainer" containerID="97ffe531c091ed348b7afa06530a7a2f977556665000110bd3b00771cb96df2f" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.396457 4749 scope.go:117] "RemoveContainer" containerID="481f4c3c3298a19c8b7aa10dee7982733390754a42eafc99e12365813a3876f9" Mar 10 
16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.417939 4749 scope.go:117] "RemoveContainer" containerID="272cada3ceb61fc24759f133c47534f4b5a8230ffd3760fc625927da01eec0f8" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.438315 4749 scope.go:117] "RemoveContainer" containerID="b8f772b2003ce768d63b8764a45b2588ebad053a7ac134c0e4169b45b3ad18f7" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.460129 4749 scope.go:117] "RemoveContainer" containerID="97ffe531c091ed348b7afa06530a7a2f977556665000110bd3b00771cb96df2f" Mar 10 16:11:27 crc kubenswrapper[4749]: E0310 16:11:27.460575 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97ffe531c091ed348b7afa06530a7a2f977556665000110bd3b00771cb96df2f\": container with ID starting with 97ffe531c091ed348b7afa06530a7a2f977556665000110bd3b00771cb96df2f not found: ID does not exist" containerID="97ffe531c091ed348b7afa06530a7a2f977556665000110bd3b00771cb96df2f" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.460606 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97ffe531c091ed348b7afa06530a7a2f977556665000110bd3b00771cb96df2f"} err="failed to get container status \"97ffe531c091ed348b7afa06530a7a2f977556665000110bd3b00771cb96df2f\": rpc error: code = NotFound desc = could not find container \"97ffe531c091ed348b7afa06530a7a2f977556665000110bd3b00771cb96df2f\": container with ID starting with 97ffe531c091ed348b7afa06530a7a2f977556665000110bd3b00771cb96df2f not found: ID does not exist" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.460624 4749 scope.go:117] "RemoveContainer" containerID="481f4c3c3298a19c8b7aa10dee7982733390754a42eafc99e12365813a3876f9" Mar 10 16:11:27 crc kubenswrapper[4749]: E0310 16:11:27.460950 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"481f4c3c3298a19c8b7aa10dee7982733390754a42eafc99e12365813a3876f9\": container with ID starting with 481f4c3c3298a19c8b7aa10dee7982733390754a42eafc99e12365813a3876f9 not found: ID does not exist" containerID="481f4c3c3298a19c8b7aa10dee7982733390754a42eafc99e12365813a3876f9" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.460976 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"481f4c3c3298a19c8b7aa10dee7982733390754a42eafc99e12365813a3876f9"} err="failed to get container status \"481f4c3c3298a19c8b7aa10dee7982733390754a42eafc99e12365813a3876f9\": rpc error: code = NotFound desc = could not find container \"481f4c3c3298a19c8b7aa10dee7982733390754a42eafc99e12365813a3876f9\": container with ID starting with 481f4c3c3298a19c8b7aa10dee7982733390754a42eafc99e12365813a3876f9 not found: ID does not exist" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.460990 4749 scope.go:117] "RemoveContainer" containerID="272cada3ceb61fc24759f133c47534f4b5a8230ffd3760fc625927da01eec0f8" Mar 10 16:11:27 crc kubenswrapper[4749]: E0310 16:11:27.461242 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"272cada3ceb61fc24759f133c47534f4b5a8230ffd3760fc625927da01eec0f8\": container with ID starting with 272cada3ceb61fc24759f133c47534f4b5a8230ffd3760fc625927da01eec0f8 not found: ID does not exist" containerID="272cada3ceb61fc24759f133c47534f4b5a8230ffd3760fc625927da01eec0f8" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.461267 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272cada3ceb61fc24759f133c47534f4b5a8230ffd3760fc625927da01eec0f8"} err="failed to get container status \"272cada3ceb61fc24759f133c47534f4b5a8230ffd3760fc625927da01eec0f8\": rpc error: code = NotFound desc = could not find container \"272cada3ceb61fc24759f133c47534f4b5a8230ffd3760fc625927da01eec0f8\": container with ID 
starting with 272cada3ceb61fc24759f133c47534f4b5a8230ffd3760fc625927da01eec0f8 not found: ID does not exist" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.461287 4749 scope.go:117] "RemoveContainer" containerID="b8f772b2003ce768d63b8764a45b2588ebad053a7ac134c0e4169b45b3ad18f7" Mar 10 16:11:27 crc kubenswrapper[4749]: E0310 16:11:27.461734 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8f772b2003ce768d63b8764a45b2588ebad053a7ac134c0e4169b45b3ad18f7\": container with ID starting with b8f772b2003ce768d63b8764a45b2588ebad053a7ac134c0e4169b45b3ad18f7 not found: ID does not exist" containerID="b8f772b2003ce768d63b8764a45b2588ebad053a7ac134c0e4169b45b3ad18f7" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.461761 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8f772b2003ce768d63b8764a45b2588ebad053a7ac134c0e4169b45b3ad18f7"} err="failed to get container status \"b8f772b2003ce768d63b8764a45b2588ebad053a7ac134c0e4169b45b3ad18f7\": rpc error: code = NotFound desc = could not find container \"b8f772b2003ce768d63b8764a45b2588ebad053a7ac134c0e4169b45b3ad18f7\": container with ID starting with b8f772b2003ce768d63b8764a45b2588ebad053a7ac134c0e4169b45b3ad18f7 not found: ID does not exist" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.490092 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-config-data\") pod \"356bdd1f-5efb-4678-bb76-39d4720e16ba\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.490144 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-combined-ca-bundle\") pod \"356bdd1f-5efb-4678-bb76-39d4720e16ba\" 
(UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.490205 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-ceilometer-tls-certs\") pod \"356bdd1f-5efb-4678-bb76-39d4720e16ba\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.490281 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-scripts\") pod \"356bdd1f-5efb-4678-bb76-39d4720e16ba\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.490315 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356bdd1f-5efb-4678-bb76-39d4720e16ba-log-httpd\") pod \"356bdd1f-5efb-4678-bb76-39d4720e16ba\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.490354 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356bdd1f-5efb-4678-bb76-39d4720e16ba-run-httpd\") pod \"356bdd1f-5efb-4678-bb76-39d4720e16ba\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.490388 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgk4z\" (UniqueName: \"kubernetes.io/projected/356bdd1f-5efb-4678-bb76-39d4720e16ba-kube-api-access-pgk4z\") pod \"356bdd1f-5efb-4678-bb76-39d4720e16ba\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.490440 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-sg-core-conf-yaml\") pod \"356bdd1f-5efb-4678-bb76-39d4720e16ba\" (UID: \"356bdd1f-5efb-4678-bb76-39d4720e16ba\") " Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.490950 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/356bdd1f-5efb-4678-bb76-39d4720e16ba-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "356bdd1f-5efb-4678-bb76-39d4720e16ba" (UID: "356bdd1f-5efb-4678-bb76-39d4720e16ba"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.491328 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/356bdd1f-5efb-4678-bb76-39d4720e16ba-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "356bdd1f-5efb-4678-bb76-39d4720e16ba" (UID: "356bdd1f-5efb-4678-bb76-39d4720e16ba"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.491871 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356bdd1f-5efb-4678-bb76-39d4720e16ba-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.492011 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356bdd1f-5efb-4678-bb76-39d4720e16ba-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.495623 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-scripts" (OuterVolumeSpecName: "scripts") pod "356bdd1f-5efb-4678-bb76-39d4720e16ba" (UID: "356bdd1f-5efb-4678-bb76-39d4720e16ba"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.496804 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/356bdd1f-5efb-4678-bb76-39d4720e16ba-kube-api-access-pgk4z" (OuterVolumeSpecName: "kube-api-access-pgk4z") pod "356bdd1f-5efb-4678-bb76-39d4720e16ba" (UID: "356bdd1f-5efb-4678-bb76-39d4720e16ba"). InnerVolumeSpecName "kube-api-access-pgk4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.525405 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "356bdd1f-5efb-4678-bb76-39d4720e16ba" (UID: "356bdd1f-5efb-4678-bb76-39d4720e16ba"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.559709 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "356bdd1f-5efb-4678-bb76-39d4720e16ba" (UID: "356bdd1f-5efb-4678-bb76-39d4720e16ba"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.598172 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.598205 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.598218 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgk4z\" (UniqueName: \"kubernetes.io/projected/356bdd1f-5efb-4678-bb76-39d4720e16ba-kube-api-access-pgk4z\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.598230 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.604082 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "356bdd1f-5efb-4678-bb76-39d4720e16ba" (UID: "356bdd1f-5efb-4678-bb76-39d4720e16ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.605342 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-config-data" (OuterVolumeSpecName: "config-data") pod "356bdd1f-5efb-4678-bb76-39d4720e16ba" (UID: "356bdd1f-5efb-4678-bb76-39d4720e16ba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.694364 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.700350 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.700402 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356bdd1f-5efb-4678-bb76-39d4720e16ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.703465 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.721505 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:11:27 crc kubenswrapper[4749]: E0310 16:11:27.721938 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="ceilometer-central-agent" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.721962 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="ceilometer-central-agent" Mar 10 16:11:27 crc kubenswrapper[4749]: E0310 16:11:27.721978 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="ceilometer-notification-agent" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.721986 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="ceilometer-notification-agent" Mar 10 16:11:27 crc kubenswrapper[4749]: E0310 16:11:27.722013 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="sg-core" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.722023 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="sg-core" Mar 10 16:11:27 crc kubenswrapper[4749]: E0310 16:11:27.722037 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="proxy-httpd" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.722044 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="proxy-httpd" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.722259 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="sg-core" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.722289 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="ceilometer-notification-agent" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.722303 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="proxy-httpd" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.722315 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" containerName="ceilometer-central-agent" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.724311 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.726354 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.727523 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.730026 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.748932 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.802541 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-config-data\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.802805 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nmbw\" (UniqueName: \"kubernetes.io/projected/3e3d73b4-812e-4486-8467-87c6dfd6ee92-kube-api-access-8nmbw\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.802917 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.803071 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e3d73b4-812e-4486-8467-87c6dfd6ee92-log-httpd\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.803165 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.803276 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e3d73b4-812e-4486-8467-87c6dfd6ee92-run-httpd\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.803370 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.803479 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-scripts\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.905719 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-config-data\") pod \"ceilometer-0\" (UID: 
\"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.905767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nmbw\" (UniqueName: \"kubernetes.io/projected/3e3d73b4-812e-4486-8467-87c6dfd6ee92-kube-api-access-8nmbw\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.905798 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.905831 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e3d73b4-812e-4486-8467-87c6dfd6ee92-log-httpd\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.905852 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.905878 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e3d73b4-812e-4486-8467-87c6dfd6ee92-run-httpd\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.905899 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.905916 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-scripts\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.906782 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e3d73b4-812e-4486-8467-87c6dfd6ee92-run-httpd\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.906932 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e3d73b4-812e-4486-8467-87c6dfd6ee92-log-httpd\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.910646 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.910868 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: 
I0310 16:11:27.910898 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-scripts\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.912017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-config-data\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.912124 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:27 crc kubenswrapper[4749]: I0310 16:11:27.924028 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nmbw\" (UniqueName: \"kubernetes.io/projected/3e3d73b4-812e-4486-8467-87c6dfd6ee92-kube-api-access-8nmbw\") pod \"ceilometer-0\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " pod="openstack/ceilometer-0" Mar 10 16:11:28 crc kubenswrapper[4749]: I0310 16:11:28.046865 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:11:28 crc kubenswrapper[4749]: I0310 16:11:28.520842 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:11:28 crc kubenswrapper[4749]: W0310 16:11:28.521193 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e3d73b4_812e_4486_8467_87c6dfd6ee92.slice/crio-d86b342f87896d3ca2cc4a25ea6baa6a5cf0675797fcb0f0b31cc0e4cc67ad8c WatchSource:0}: Error finding container d86b342f87896d3ca2cc4a25ea6baa6a5cf0675797fcb0f0b31cc0e4cc67ad8c: Status 404 returned error can't find the container with id d86b342f87896d3ca2cc4a25ea6baa6a5cf0675797fcb0f0b31cc0e4cc67ad8c Mar 10 16:11:29 crc kubenswrapper[4749]: I0310 16:11:29.396527 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e3d73b4-812e-4486-8467-87c6dfd6ee92","Type":"ContainerStarted","Data":"595e6a774c6f1fb6971d897ceee5714bc8b70476939e4b04c3cdfc26a133bd65"} Mar 10 16:11:29 crc kubenswrapper[4749]: I0310 16:11:29.396840 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e3d73b4-812e-4486-8467-87c6dfd6ee92","Type":"ContainerStarted","Data":"d86b342f87896d3ca2cc4a25ea6baa6a5cf0675797fcb0f0b31cc0e4cc67ad8c"} Mar 10 16:11:29 crc kubenswrapper[4749]: I0310 16:11:29.616290 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="356bdd1f-5efb-4678-bb76-39d4720e16ba" path="/var/lib/kubelet/pods/356bdd1f-5efb-4678-bb76-39d4720e16ba/volumes" Mar 10 16:11:29 crc kubenswrapper[4749]: I0310 16:11:29.951718 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:11:29 crc kubenswrapper[4749]: I0310 16:11:29.975663 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 16:11:29 crc kubenswrapper[4749]: I0310 16:11:29.975996 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.041837 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-logs\") pod \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.041886 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-combined-ca-bundle\") pod \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.041915 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdh6l\" (UniqueName: \"kubernetes.io/projected/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-kube-api-access-zdh6l\") pod \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.041982 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-config-data\") pod \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\" (UID: \"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933\") " Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.043446 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-logs" (OuterVolumeSpecName: "logs") pod "9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" (UID: "9ce7ef9a-5d0a-4091-b1b2-03c91c32d933"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.052009 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-kube-api-access-zdh6l" (OuterVolumeSpecName: "kube-api-access-zdh6l") pod "9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" (UID: "9ce7ef9a-5d0a-4091-b1b2-03c91c32d933"). InnerVolumeSpecName "kube-api-access-zdh6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.119891 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" (UID: "9ce7ef9a-5d0a-4091-b1b2-03c91c32d933"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.139602 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-config-data" (OuterVolumeSpecName: "config-data") pod "9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" (UID: "9ce7ef9a-5d0a-4091-b1b2-03c91c32d933"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.145087 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.145240 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.145303 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdh6l\" (UniqueName: \"kubernetes.io/projected/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-kube-api-access-zdh6l\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.145440 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.407068 4749 generic.go:334] "Generic (PLEG): container finished" podID="9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" containerID="1ddf598384d462b86a96131544a59f47b34d385fc5faec6973c307b30a154fe3" exitCode=0 Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.407136 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.407186 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933","Type":"ContainerDied","Data":"1ddf598384d462b86a96131544a59f47b34d385fc5faec6973c307b30a154fe3"} Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.407250 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ce7ef9a-5d0a-4091-b1b2-03c91c32d933","Type":"ContainerDied","Data":"a6fdd3666d75a4faeb60443aafea130153f75d48a6a5d6bf60c64eff934fd3b9"} Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.407272 4749 scope.go:117] "RemoveContainer" containerID="1ddf598384d462b86a96131544a59f47b34d385fc5faec6973c307b30a154fe3" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.410317 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e3d73b4-812e-4486-8467-87c6dfd6ee92","Type":"ContainerStarted","Data":"fd96a7ab3a64b263ae86578a46b4ac785d6b725cfd0504260b8b86b6c6c66caa"} Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.441566 4749 scope.go:117] "RemoveContainer" containerID="6c14ddd3cc8d984f0eb0e7a785cc1efc9be9917719ee00504c76feb0015858d4" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.443571 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.460522 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.472286 4749 scope.go:117] "RemoveContainer" containerID="1ddf598384d462b86a96131544a59f47b34d385fc5faec6973c307b30a154fe3" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.473897 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:30 crc kubenswrapper[4749]: E0310 16:11:30.474439 4749 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ddf598384d462b86a96131544a59f47b34d385fc5faec6973c307b30a154fe3\": container with ID starting with 1ddf598384d462b86a96131544a59f47b34d385fc5faec6973c307b30a154fe3 not found: ID does not exist" containerID="1ddf598384d462b86a96131544a59f47b34d385fc5faec6973c307b30a154fe3" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.474498 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ddf598384d462b86a96131544a59f47b34d385fc5faec6973c307b30a154fe3"} err="failed to get container status \"1ddf598384d462b86a96131544a59f47b34d385fc5faec6973c307b30a154fe3\": rpc error: code = NotFound desc = could not find container \"1ddf598384d462b86a96131544a59f47b34d385fc5faec6973c307b30a154fe3\": container with ID starting with 1ddf598384d462b86a96131544a59f47b34d385fc5faec6973c307b30a154fe3 not found: ID does not exist" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.474530 4749 scope.go:117] "RemoveContainer" containerID="6c14ddd3cc8d984f0eb0e7a785cc1efc9be9917719ee00504c76feb0015858d4" Mar 10 16:11:30 crc kubenswrapper[4749]: E0310 16:11:30.474817 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c14ddd3cc8d984f0eb0e7a785cc1efc9be9917719ee00504c76feb0015858d4\": container with ID starting with 6c14ddd3cc8d984f0eb0e7a785cc1efc9be9917719ee00504c76feb0015858d4 not found: ID does not exist" containerID="6c14ddd3cc8d984f0eb0e7a785cc1efc9be9917719ee00504c76feb0015858d4" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.474840 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c14ddd3cc8d984f0eb0e7a785cc1efc9be9917719ee00504c76feb0015858d4"} err="failed to get container status \"6c14ddd3cc8d984f0eb0e7a785cc1efc9be9917719ee00504c76feb0015858d4\": rpc error: code = NotFound desc = could 
not find container \"6c14ddd3cc8d984f0eb0e7a785cc1efc9be9917719ee00504c76feb0015858d4\": container with ID starting with 6c14ddd3cc8d984f0eb0e7a785cc1efc9be9917719ee00504c76feb0015858d4 not found: ID does not exist" Mar 10 16:11:30 crc kubenswrapper[4749]: E0310 16:11:30.475936 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" containerName="nova-api-api" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.475957 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" containerName="nova-api-api" Mar 10 16:11:30 crc kubenswrapper[4749]: E0310 16:11:30.475979 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" containerName="nova-api-log" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.475986 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" containerName="nova-api-log" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.476161 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" containerName="nova-api-api" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.476195 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" containerName="nova-api-log" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.477187 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.480554 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.480756 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.481457 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.490911 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.551873 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-config-data\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.552157 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.552293 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.552676 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a0130e75-0c6d-4d01-aa41-d72ee247ec46-logs\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.552722 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.552845 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjvdb\" (UniqueName: \"kubernetes.io/projected/a0130e75-0c6d-4d01-aa41-d72ee247ec46-kube-api-access-sjvdb\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.654285 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-config-data\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.654338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.654367 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 
16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.654468 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0130e75-0c6d-4d01-aa41-d72ee247ec46-logs\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.654485 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.654516 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvdb\" (UniqueName: \"kubernetes.io/projected/a0130e75-0c6d-4d01-aa41-d72ee247ec46-kube-api-access-sjvdb\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.656221 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0130e75-0c6d-4d01-aa41-d72ee247ec46-logs\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.658056 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.658296 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-config-data\") pod \"nova-api-0\" (UID: 
\"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.659256 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.660081 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.672022 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjvdb\" (UniqueName: \"kubernetes.io/projected/a0130e75-0c6d-4d01-aa41-d72ee247ec46-kube-api-access-sjvdb\") pod \"nova-api-0\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.686161 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.725718 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.799345 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.997805 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 16:11:30 crc kubenswrapper[4749]: I0310 16:11:30.998074 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.296264 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.419595 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e3d73b4-812e-4486-8467-87c6dfd6ee92","Type":"ContainerStarted","Data":"a7e54b42d006c7f4c24dab0c52ee76e67b32f801f6a37cf60527b10f12948e8b"} Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.420782 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0130e75-0c6d-4d01-aa41-d72ee247ec46","Type":"ContainerStarted","Data":"6a1b00d5f276a3f9c33d1fdc87e3a18432819a8a49cbc3e700ec5af987fa53f7"} Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.442701 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.622580 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ce7ef9a-5d0a-4091-b1b2-03c91c32d933" path="/var/lib/kubelet/pods/9ce7ef9a-5d0a-4091-b1b2-03c91c32d933/volumes" Mar 10 16:11:31 crc 
kubenswrapper[4749]: I0310 16:11:31.700016 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-cd6gw"] Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.701427 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.703536 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.704596 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.717911 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-cd6gw"] Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.781856 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-scripts\") pod \"nova-cell1-cell-mapping-cd6gw\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.781980 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvvzd\" (UniqueName: \"kubernetes.io/projected/98258a79-dfdc-4fd5-be54-d94353ae3fe7-kube-api-access-nvvzd\") pod \"nova-cell1-cell-mapping-cd6gw\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.782090 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cd6gw\" (UID: 
\"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.782125 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-config-data\") pod \"nova-cell1-cell-mapping-cd6gw\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.883907 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-scripts\") pod \"nova-cell1-cell-mapping-cd6gw\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.884026 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvvzd\" (UniqueName: \"kubernetes.io/projected/98258a79-dfdc-4fd5-be54-d94353ae3fe7-kube-api-access-nvvzd\") pod \"nova-cell1-cell-mapping-cd6gw\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.884126 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cd6gw\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.884156 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-config-data\") pod \"nova-cell1-cell-mapping-cd6gw\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") 
" pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.888981 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cd6gw\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.889923 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-scripts\") pod \"nova-cell1-cell-mapping-cd6gw\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.892960 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-config-data\") pod \"nova-cell1-cell-mapping-cd6gw\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:31 crc kubenswrapper[4749]: I0310 16:11:31.903121 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvvzd\" (UniqueName: \"kubernetes.io/projected/98258a79-dfdc-4fd5-be54-d94353ae3fe7-kube-api-access-nvvzd\") pod \"nova-cell1-cell-mapping-cd6gw\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:32 crc kubenswrapper[4749]: I0310 16:11:32.018004 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:32 crc kubenswrapper[4749]: I0310 16:11:32.433066 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0130e75-0c6d-4d01-aa41-d72ee247ec46","Type":"ContainerStarted","Data":"a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554"} Mar 10 16:11:32 crc kubenswrapper[4749]: I0310 16:11:32.433509 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0130e75-0c6d-4d01-aa41-d72ee247ec46","Type":"ContainerStarted","Data":"2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115"} Mar 10 16:11:32 crc kubenswrapper[4749]: I0310 16:11:32.465043 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.465027321 podStartE2EDuration="2.465027321s" podCreationTimestamp="2026-03-10 16:11:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:11:32.457526864 +0000 UTC m=+1389.579392551" watchObservedRunningTime="2026-03-10 16:11:32.465027321 +0000 UTC m=+1389.586893008" Mar 10 16:11:32 crc kubenswrapper[4749]: I0310 16:11:32.481972 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-cd6gw"] Mar 10 16:11:33 crc kubenswrapper[4749]: I0310 16:11:33.452167 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cd6gw" event={"ID":"98258a79-dfdc-4fd5-be54-d94353ae3fe7","Type":"ContainerStarted","Data":"72f8a33df83b0d373bbc75a2ec6ea7a30edc86dd30840557b6a3948d5156af59"} Mar 10 16:11:33 crc kubenswrapper[4749]: I0310 16:11:33.452591 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cd6gw" 
event={"ID":"98258a79-dfdc-4fd5-be54-d94353ae3fe7","Type":"ContainerStarted","Data":"133952f5f5c4389d9671130fbed82e9fe345e832e68bd9b254560c205b3caa7b"} Mar 10 16:11:33 crc kubenswrapper[4749]: I0310 16:11:33.476805 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-cd6gw" podStartSLOduration=2.476782116 podStartE2EDuration="2.476782116s" podCreationTimestamp="2026-03-10 16:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:11:33.465403551 +0000 UTC m=+1390.587269238" watchObservedRunningTime="2026-03-10 16:11:33.476782116 +0000 UTC m=+1390.598647803" Mar 10 16:11:33 crc kubenswrapper[4749]: I0310 16:11:33.858235 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:11:33 crc kubenswrapper[4749]: I0310 16:11:33.946789 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5695c9cc-pfsz9"] Mar 10 16:11:33 crc kubenswrapper[4749]: I0310 16:11:33.947071 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" podUID="e30eb114-ece0-4fa1-ba0d-85a33de05463" containerName="dnsmasq-dns" containerID="cri-o://d98ff629c8a6f49966351adb1d7774cf20efbbd1a83fc5945d726cf4ccbcc436" gracePeriod=10 Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.461735 4749 generic.go:334] "Generic (PLEG): container finished" podID="e30eb114-ece0-4fa1-ba0d-85a33de05463" containerID="d98ff629c8a6f49966351adb1d7774cf20efbbd1a83fc5945d726cf4ccbcc436" exitCode=0 Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.461767 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" event={"ID":"e30eb114-ece0-4fa1-ba0d-85a33de05463","Type":"ContainerDied","Data":"d98ff629c8a6f49966351adb1d7774cf20efbbd1a83fc5945d726cf4ccbcc436"} Mar 
10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.462121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" event={"ID":"e30eb114-ece0-4fa1-ba0d-85a33de05463","Type":"ContainerDied","Data":"1dfc49d426645f3c96d6c1d79858829cbd852a40f8e3ff4370a67615ee4565f7"} Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.462133 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dfc49d426645f3c96d6c1d79858829cbd852a40f8e3ff4370a67615ee4565f7" Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.465872 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e3d73b4-812e-4486-8467-87c6dfd6ee92","Type":"ContainerStarted","Data":"d4163978450ae5a28c7305f78e151c8b39face70e710fef1aa4e65399f74f360"} Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.466086 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.488986 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.690232983 podStartE2EDuration="7.488967473s" podCreationTimestamp="2026-03-10 16:11:27 +0000 UTC" firstStartedPulling="2026-03-10 16:11:28.523707683 +0000 UTC m=+1385.645573360" lastFinishedPulling="2026-03-10 16:11:33.322442163 +0000 UTC m=+1390.444307850" observedRunningTime="2026-03-10 16:11:34.485855196 +0000 UTC m=+1391.607720913" watchObservedRunningTime="2026-03-10 16:11:34.488967473 +0000 UTC m=+1391.610833170" Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.505363 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.650526 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8qlk\" (UniqueName: \"kubernetes.io/projected/e30eb114-ece0-4fa1-ba0d-85a33de05463-kube-api-access-b8qlk\") pod \"e30eb114-ece0-4fa1-ba0d-85a33de05463\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.650609 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-nb\") pod \"e30eb114-ece0-4fa1-ba0d-85a33de05463\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.650692 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-config\") pod \"e30eb114-ece0-4fa1-ba0d-85a33de05463\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.650831 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-svc\") pod \"e30eb114-ece0-4fa1-ba0d-85a33de05463\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.650880 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-swift-storage-0\") pod \"e30eb114-ece0-4fa1-ba0d-85a33de05463\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.651017 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-sb\") pod \"e30eb114-ece0-4fa1-ba0d-85a33de05463\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.656740 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e30eb114-ece0-4fa1-ba0d-85a33de05463-kube-api-access-b8qlk" (OuterVolumeSpecName: "kube-api-access-b8qlk") pod "e30eb114-ece0-4fa1-ba0d-85a33de05463" (UID: "e30eb114-ece0-4fa1-ba0d-85a33de05463"). InnerVolumeSpecName "kube-api-access-b8qlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.753502 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8qlk\" (UniqueName: \"kubernetes.io/projected/e30eb114-ece0-4fa1-ba0d-85a33de05463-kube-api-access-b8qlk\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:34 crc kubenswrapper[4749]: E0310 16:11:34.895291 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-nb podName:e30eb114-ece0-4fa1-ba0d-85a33de05463 nodeName:}" failed. No retries permitted until 2026-03-10 16:11:35.395250234 +0000 UTC m=+1392.517115941 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ovsdbserver-nb" (UniqueName: "kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-nb") pod "e30eb114-ece0-4fa1-ba0d-85a33de05463" (UID: "e30eb114-ece0-4fa1-ba0d-85a33de05463") : error deleting /var/lib/kubelet/pods/e30eb114-ece0-4fa1-ba0d-85a33de05463/volume-subpaths: remove /var/lib/kubelet/pods/e30eb114-ece0-4fa1-ba0d-85a33de05463/volume-subpaths: no such file or directory Mar 10 16:11:34 crc kubenswrapper[4749]: E0310 16:11:34.895365 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-svc podName:e30eb114-ece0-4fa1-ba0d-85a33de05463 nodeName:}" failed. No retries permitted until 2026-03-10 16:11:35.395352107 +0000 UTC m=+1392.517217814 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-svc") pod "e30eb114-ece0-4fa1-ba0d-85a33de05463" (UID: "e30eb114-ece0-4fa1-ba0d-85a33de05463") : error deleting /var/lib/kubelet/pods/e30eb114-ece0-4fa1-ba0d-85a33de05463/volume-subpaths: remove /var/lib/kubelet/pods/e30eb114-ece0-4fa1-ba0d-85a33de05463/volume-subpaths: no such file or directory Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.895806 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-config" (OuterVolumeSpecName: "config") pod "e30eb114-ece0-4fa1-ba0d-85a33de05463" (UID: "e30eb114-ece0-4fa1-ba0d-85a33de05463"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.895823 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e30eb114-ece0-4fa1-ba0d-85a33de05463" (UID: "e30eb114-ece0-4fa1-ba0d-85a33de05463"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.895847 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e30eb114-ece0-4fa1-ba0d-85a33de05463" (UID: "e30eb114-ece0-4fa1-ba0d-85a33de05463"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.956882 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.956919 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:34 crc kubenswrapper[4749]: I0310 16:11:34.956933 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:35 crc kubenswrapper[4749]: I0310 16:11:35.465091 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-nb\") pod \"e30eb114-ece0-4fa1-ba0d-85a33de05463\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " Mar 10 16:11:35 crc kubenswrapper[4749]: I0310 16:11:35.465686 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-svc\") pod \"e30eb114-ece0-4fa1-ba0d-85a33de05463\" (UID: \"e30eb114-ece0-4fa1-ba0d-85a33de05463\") " Mar 10 16:11:35 crc kubenswrapper[4749]: I0310 16:11:35.466135 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e30eb114-ece0-4fa1-ba0d-85a33de05463" (UID: "e30eb114-ece0-4fa1-ba0d-85a33de05463"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:11:35 crc kubenswrapper[4749]: I0310 16:11:35.466942 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:35 crc kubenswrapper[4749]: I0310 16:11:35.469842 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e30eb114-ece0-4fa1-ba0d-85a33de05463" (UID: "e30eb114-ece0-4fa1-ba0d-85a33de05463"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:11:35 crc kubenswrapper[4749]: I0310 16:11:35.474645 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5695c9cc-pfsz9" Mar 10 16:11:35 crc kubenswrapper[4749]: I0310 16:11:35.513092 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5695c9cc-pfsz9"] Mar 10 16:11:35 crc kubenswrapper[4749]: I0310 16:11:35.524053 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5695c9cc-pfsz9"] Mar 10 16:11:35 crc kubenswrapper[4749]: I0310 16:11:35.569112 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e30eb114-ece0-4fa1-ba0d-85a33de05463-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:35 crc kubenswrapper[4749]: I0310 16:11:35.626563 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e30eb114-ece0-4fa1-ba0d-85a33de05463" path="/var/lib/kubelet/pods/e30eb114-ece0-4fa1-ba0d-85a33de05463/volumes" Mar 10 16:11:38 crc kubenswrapper[4749]: I0310 16:11:38.505951 4749 generic.go:334] "Generic (PLEG): container finished" podID="98258a79-dfdc-4fd5-be54-d94353ae3fe7" containerID="72f8a33df83b0d373bbc75a2ec6ea7a30edc86dd30840557b6a3948d5156af59" exitCode=0 Mar 10 16:11:38 crc kubenswrapper[4749]: I0310 16:11:38.506038 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cd6gw" event={"ID":"98258a79-dfdc-4fd5-be54-d94353ae3fe7","Type":"ContainerDied","Data":"72f8a33df83b0d373bbc75a2ec6ea7a30edc86dd30840557b6a3948d5156af59"} Mar 10 16:11:39 crc kubenswrapper[4749]: I0310 16:11:39.891995 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:39 crc kubenswrapper[4749]: I0310 16:11:39.955593 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-combined-ca-bundle\") pod \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " Mar 10 16:11:39 crc kubenswrapper[4749]: I0310 16:11:39.955814 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-scripts\") pod \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " Mar 10 16:11:39 crc kubenswrapper[4749]: I0310 16:11:39.955854 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-config-data\") pod \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " Mar 10 16:11:39 crc kubenswrapper[4749]: I0310 16:11:39.955963 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvvzd\" (UniqueName: \"kubernetes.io/projected/98258a79-dfdc-4fd5-be54-d94353ae3fe7-kube-api-access-nvvzd\") pod \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\" (UID: \"98258a79-dfdc-4fd5-be54-d94353ae3fe7\") " Mar 10 16:11:39 crc kubenswrapper[4749]: I0310 16:11:39.962447 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98258a79-dfdc-4fd5-be54-d94353ae3fe7-kube-api-access-nvvzd" (OuterVolumeSpecName: "kube-api-access-nvvzd") pod "98258a79-dfdc-4fd5-be54-d94353ae3fe7" (UID: "98258a79-dfdc-4fd5-be54-d94353ae3fe7"). InnerVolumeSpecName "kube-api-access-nvvzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:11:39 crc kubenswrapper[4749]: I0310 16:11:39.963973 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-scripts" (OuterVolumeSpecName: "scripts") pod "98258a79-dfdc-4fd5-be54-d94353ae3fe7" (UID: "98258a79-dfdc-4fd5-be54-d94353ae3fe7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:39 crc kubenswrapper[4749]: I0310 16:11:39.984081 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 16:11:39 crc kubenswrapper[4749]: I0310 16:11:39.986037 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 16:11:39 crc kubenswrapper[4749]: I0310 16:11:39.990959 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 16:11:39 crc kubenswrapper[4749]: I0310 16:11:39.998642 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-config-data" (OuterVolumeSpecName: "config-data") pod "98258a79-dfdc-4fd5-be54-d94353ae3fe7" (UID: "98258a79-dfdc-4fd5-be54-d94353ae3fe7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.020667 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98258a79-dfdc-4fd5-be54-d94353ae3fe7" (UID: "98258a79-dfdc-4fd5-be54-d94353ae3fe7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.059233 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.059277 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.059292 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98258a79-dfdc-4fd5-be54-d94353ae3fe7-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.059304 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvvzd\" (UniqueName: \"kubernetes.io/projected/98258a79-dfdc-4fd5-be54-d94353ae3fe7-kube-api-access-nvvzd\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.535913 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cd6gw" Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.536014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cd6gw" event={"ID":"98258a79-dfdc-4fd5-be54-d94353ae3fe7","Type":"ContainerDied","Data":"133952f5f5c4389d9671130fbed82e9fe345e832e68bd9b254560c205b3caa7b"} Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.536090 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="133952f5f5c4389d9671130fbed82e9fe345e832e68bd9b254560c205b3caa7b" Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.543109 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.725217 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.725466 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a0130e75-0c6d-4d01-aa41-d72ee247ec46" containerName="nova-api-log" containerID="cri-o://2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115" gracePeriod=30 Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.725576 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a0130e75-0c6d-4d01-aa41-d72ee247ec46" containerName="nova-api-api" containerID="cri-o://a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554" gracePeriod=30 Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.823024 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.823778 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="45ba94ea-09bc-4752-b93f-edf9ede8b871" 
containerName="nova-scheduler-scheduler" containerID="cri-o://fbd038681ab1812c238db796f0ec51f2c716ef880f7cdfee0cb69680be7a5c66" gracePeriod=30 Mar 10 16:11:40 crc kubenswrapper[4749]: I0310 16:11:40.850405 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.297367 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.397572 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-internal-tls-certs\") pod \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.397728 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-combined-ca-bundle\") pod \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.397754 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjvdb\" (UniqueName: \"kubernetes.io/projected/a0130e75-0c6d-4d01-aa41-d72ee247ec46-kube-api-access-sjvdb\") pod \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.397815 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-config-data\") pod \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.397831 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-public-tls-certs\") pod \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.397911 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0130e75-0c6d-4d01-aa41-d72ee247ec46-logs\") pod \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\" (UID: \"a0130e75-0c6d-4d01-aa41-d72ee247ec46\") " Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.398558 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0130e75-0c6d-4d01-aa41-d72ee247ec46-logs" (OuterVolumeSpecName: "logs") pod "a0130e75-0c6d-4d01-aa41-d72ee247ec46" (UID: "a0130e75-0c6d-4d01-aa41-d72ee247ec46"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.404166 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0130e75-0c6d-4d01-aa41-d72ee247ec46-kube-api-access-sjvdb" (OuterVolumeSpecName: "kube-api-access-sjvdb") pod "a0130e75-0c6d-4d01-aa41-d72ee247ec46" (UID: "a0130e75-0c6d-4d01-aa41-d72ee247ec46"). InnerVolumeSpecName "kube-api-access-sjvdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.430430 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0130e75-0c6d-4d01-aa41-d72ee247ec46" (UID: "a0130e75-0c6d-4d01-aa41-d72ee247ec46"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.433167 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-config-data" (OuterVolumeSpecName: "config-data") pod "a0130e75-0c6d-4d01-aa41-d72ee247ec46" (UID: "a0130e75-0c6d-4d01-aa41-d72ee247ec46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.472518 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a0130e75-0c6d-4d01-aa41-d72ee247ec46" (UID: "a0130e75-0c6d-4d01-aa41-d72ee247ec46"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.501775 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.501827 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjvdb\" (UniqueName: \"kubernetes.io/projected/a0130e75-0c6d-4d01-aa41-d72ee247ec46-kube-api-access-sjvdb\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.501841 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.501852 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0130e75-0c6d-4d01-aa41-d72ee247ec46-logs\") on node \"crc\" DevicePath \"\"" Mar 10 
16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.501864 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.501783 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a0130e75-0c6d-4d01-aa41-d72ee247ec46" (UID: "a0130e75-0c6d-4d01-aa41-d72ee247ec46"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.588375 4749 generic.go:334] "Generic (PLEG): container finished" podID="a0130e75-0c6d-4d01-aa41-d72ee247ec46" containerID="a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554" exitCode=0 Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.588409 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0130e75-0c6d-4d01-aa41-d72ee247ec46","Type":"ContainerDied","Data":"a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554"} Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.588442 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.588469 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0130e75-0c6d-4d01-aa41-d72ee247ec46","Type":"ContainerDied","Data":"2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115"} Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.588487 4749 scope.go:117] "RemoveContainer" containerID="a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.588430 4749 generic.go:334] "Generic (PLEG): container finished" podID="a0130e75-0c6d-4d01-aa41-d72ee247ec46" containerID="2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115" exitCode=143 Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.588607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0130e75-0c6d-4d01-aa41-d72ee247ec46","Type":"ContainerDied","Data":"6a1b00d5f276a3f9c33d1fdc87e3a18432819a8a49cbc3e700ec5af987fa53f7"} Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.604464 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0130e75-0c6d-4d01-aa41-d72ee247ec46-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:41 crc kubenswrapper[4749]: E0310 16:11:41.621671 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbd038681ab1812c238db796f0ec51f2c716ef880f7cdfee0cb69680be7a5c66" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 16:11:41 crc kubenswrapper[4749]: E0310 16:11:41.625652 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="fbd038681ab1812c238db796f0ec51f2c716ef880f7cdfee0cb69680be7a5c66" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 16:11:41 crc kubenswrapper[4749]: E0310 16:11:41.626858 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fbd038681ab1812c238db796f0ec51f2c716ef880f7cdfee0cb69680be7a5c66" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 16:11:41 crc kubenswrapper[4749]: E0310 16:11:41.626903 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="45ba94ea-09bc-4752-b93f-edf9ede8b871" containerName="nova-scheduler-scheduler" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.637421 4749 scope.go:117] "RemoveContainer" containerID="2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.644759 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.661182 4749 scope.go:117] "RemoveContainer" containerID="a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554" Mar 10 16:11:41 crc kubenswrapper[4749]: E0310 16:11:41.662749 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554\": container with ID starting with a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554 not found: ID does not exist" containerID="a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.662785 4749 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554"} err="failed to get container status \"a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554\": rpc error: code = NotFound desc = could not find container \"a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554\": container with ID starting with a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554 not found: ID does not exist" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.662822 4749 scope.go:117] "RemoveContainer" containerID="2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115" Mar 10 16:11:41 crc kubenswrapper[4749]: E0310 16:11:41.663088 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115\": container with ID starting with 2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115 not found: ID does not exist" containerID="2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.663116 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115"} err="failed to get container status \"2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115\": rpc error: code = NotFound desc = could not find container \"2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115\": container with ID starting with 2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115 not found: ID does not exist" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.663135 4749 scope.go:117] "RemoveContainer" containerID="a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.663543 4749 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554"} err="failed to get container status \"a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554\": rpc error: code = NotFound desc = could not find container \"a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554\": container with ID starting with a0805814fba2c7da029223c0ecf2958acc2d9b71612efbe9f9352e6702bb5554 not found: ID does not exist" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.663566 4749 scope.go:117] "RemoveContainer" containerID="2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.665252 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.674906 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115"} err="failed to get container status \"2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115\": rpc error: code = NotFound desc = could not find container \"2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115\": container with ID starting with 2941bdda1d7155ed87bb367dd7671141cc459a39dc3e673cba753e048a94c115 not found: ID does not exist" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.690331 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:41 crc kubenswrapper[4749]: E0310 16:11:41.690864 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0130e75-0c6d-4d01-aa41-d72ee247ec46" containerName="nova-api-log" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.690891 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0130e75-0c6d-4d01-aa41-d72ee247ec46" containerName="nova-api-log" Mar 10 16:11:41 crc kubenswrapper[4749]: E0310 16:11:41.690913 4749 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98258a79-dfdc-4fd5-be54-d94353ae3fe7" containerName="nova-manage" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.690922 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="98258a79-dfdc-4fd5-be54-d94353ae3fe7" containerName="nova-manage" Mar 10 16:11:41 crc kubenswrapper[4749]: E0310 16:11:41.690952 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e30eb114-ece0-4fa1-ba0d-85a33de05463" containerName="init" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.690960 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e30eb114-ece0-4fa1-ba0d-85a33de05463" containerName="init" Mar 10 16:11:41 crc kubenswrapper[4749]: E0310 16:11:41.690974 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0130e75-0c6d-4d01-aa41-d72ee247ec46" containerName="nova-api-api" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.690980 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0130e75-0c6d-4d01-aa41-d72ee247ec46" containerName="nova-api-api" Mar 10 16:11:41 crc kubenswrapper[4749]: E0310 16:11:41.690995 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e30eb114-ece0-4fa1-ba0d-85a33de05463" containerName="dnsmasq-dns" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.691003 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e30eb114-ece0-4fa1-ba0d-85a33de05463" containerName="dnsmasq-dns" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.691219 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e30eb114-ece0-4fa1-ba0d-85a33de05463" containerName="dnsmasq-dns" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.691251 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0130e75-0c6d-4d01-aa41-d72ee247ec46" containerName="nova-api-api" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.691271 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a0130e75-0c6d-4d01-aa41-d72ee247ec46" containerName="nova-api-log" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.691284 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="98258a79-dfdc-4fd5-be54-d94353ae3fe7" containerName="nova-manage" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.692348 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.697068 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.697244 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.698341 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.718268 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.812634 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg9q9\" (UniqueName: \"kubernetes.io/projected/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-kube-api-access-bg9q9\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.812833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.812886 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-logs\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.813151 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.813402 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.813578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-config-data\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.915450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.915525 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-config-data\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " 
pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.915554 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg9q9\" (UniqueName: \"kubernetes.io/projected/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-kube-api-access-bg9q9\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.915586 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.915600 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-logs\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.915654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.916593 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-logs\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.919156 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.919234 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.919610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.919705 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-config-data\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:41 crc kubenswrapper[4749]: I0310 16:11:41.934032 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg9q9\" (UniqueName: \"kubernetes.io/projected/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-kube-api-access-bg9q9\") pod \"nova-api-0\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " pod="openstack/nova-api-0" Mar 10 16:11:42 crc kubenswrapper[4749]: I0310 16:11:42.022903 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:11:42 crc kubenswrapper[4749]: I0310 16:11:42.480490 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:11:42 crc kubenswrapper[4749]: W0310 16:11:42.484081 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba4333fd_3a72_41c7_a82a_448ed0ccfc1e.slice/crio-9ae122f4c1054e59447954e132280768320cc528a2c9fe88938a016bff40cf7a WatchSource:0}: Error finding container 9ae122f4c1054e59447954e132280768320cc528a2c9fe88938a016bff40cf7a: Status 404 returned error can't find the container with id 9ae122f4c1054e59447954e132280768320cc528a2c9fe88938a016bff40cf7a Mar 10 16:11:42 crc kubenswrapper[4749]: I0310 16:11:42.600097 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e","Type":"ContainerStarted","Data":"9ae122f4c1054e59447954e132280768320cc528a2c9fe88938a016bff40cf7a"} Mar 10 16:11:42 crc kubenswrapper[4749]: I0310 16:11:42.600251 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" containerName="nova-metadata-log" containerID="cri-o://c8bc78a6602473bafbbcb6def7b8eb27054b7820dd75e58a86c507d9d9fd7900" gracePeriod=30 Mar 10 16:11:42 crc kubenswrapper[4749]: I0310 16:11:42.600361 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" containerName="nova-metadata-metadata" containerID="cri-o://31bfbc9370df77520454e57e145a24575e3cb20e646035c0442c2954c9a82d00" gracePeriod=30 Mar 10 16:11:43 crc kubenswrapper[4749]: I0310 16:11:43.617683 4749 generic.go:334] "Generic (PLEG): container finished" podID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" containerID="c8bc78a6602473bafbbcb6def7b8eb27054b7820dd75e58a86c507d9d9fd7900" 
exitCode=143 Mar 10 16:11:43 crc kubenswrapper[4749]: I0310 16:11:43.621360 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0130e75-0c6d-4d01-aa41-d72ee247ec46" path="/var/lib/kubelet/pods/a0130e75-0c6d-4d01-aa41-d72ee247ec46/volumes" Mar 10 16:11:43 crc kubenswrapper[4749]: I0310 16:11:43.621983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e","Type":"ContainerStarted","Data":"98105fccfb28ffdfad3a0b356c5f8eb37b06bcf57c45dd3c31713662a31ddfe5"} Mar 10 16:11:43 crc kubenswrapper[4749]: I0310 16:11:43.622020 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e","Type":"ContainerStarted","Data":"15200dc474647db1a4fe0a09c7e300067457404fd74cf484162cb0842079dff1"} Mar 10 16:11:43 crc kubenswrapper[4749]: I0310 16:11:43.622032 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b57b1973-d9a3-4695-b38f-a15bdf5ec778","Type":"ContainerDied","Data":"c8bc78a6602473bafbbcb6def7b8eb27054b7820dd75e58a86c507d9d9fd7900"} Mar 10 16:11:43 crc kubenswrapper[4749]: I0310 16:11:43.677757 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.677722904 podStartE2EDuration="2.677722904s" podCreationTimestamp="2026-03-10 16:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:11:43.650642456 +0000 UTC m=+1400.772508163" watchObservedRunningTime="2026-03-10 16:11:43.677722904 +0000 UTC m=+1400.799588631" Mar 10 16:11:45 crc kubenswrapper[4749]: I0310 16:11:45.642513 4749 generic.go:334] "Generic (PLEG): container finished" podID="45ba94ea-09bc-4752-b93f-edf9ede8b871" containerID="fbd038681ab1812c238db796f0ec51f2c716ef880f7cdfee0cb69680be7a5c66" exitCode=0 Mar 10 16:11:45 crc 
kubenswrapper[4749]: I0310 16:11:45.642585 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"45ba94ea-09bc-4752-b93f-edf9ede8b871","Type":"ContainerDied","Data":"fbd038681ab1812c238db796f0ec51f2c716ef880f7cdfee0cb69680be7a5c66"} Mar 10 16:11:45 crc kubenswrapper[4749]: I0310 16:11:45.663450 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": read tcp 10.217.0.2:57732->10.217.0.201:8775: read: connection reset by peer" Mar 10 16:11:45 crc kubenswrapper[4749]: I0310 16:11:45.663450 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": read tcp 10.217.0.2:57730->10.217.0.201:8775: read: connection reset by peer" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.198858 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.315285 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b57b1973-d9a3-4695-b38f-a15bdf5ec778-logs\") pod \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.315347 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-combined-ca-bundle\") pod \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.315451 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxzgf\" (UniqueName: \"kubernetes.io/projected/b57b1973-d9a3-4695-b38f-a15bdf5ec778-kube-api-access-gxzgf\") pod \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.315874 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-nova-metadata-tls-certs\") pod \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.315933 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-config-data\") pod \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\" (UID: \"b57b1973-d9a3-4695-b38f-a15bdf5ec778\") " Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.317049 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b57b1973-d9a3-4695-b38f-a15bdf5ec778-logs" (OuterVolumeSpecName: "logs") pod "b57b1973-d9a3-4695-b38f-a15bdf5ec778" (UID: "b57b1973-d9a3-4695-b38f-a15bdf5ec778"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.322096 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57b1973-d9a3-4695-b38f-a15bdf5ec778-kube-api-access-gxzgf" (OuterVolumeSpecName: "kube-api-access-gxzgf") pod "b57b1973-d9a3-4695-b38f-a15bdf5ec778" (UID: "b57b1973-d9a3-4695-b38f-a15bdf5ec778"). InnerVolumeSpecName "kube-api-access-gxzgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.345582 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-config-data" (OuterVolumeSpecName: "config-data") pod "b57b1973-d9a3-4695-b38f-a15bdf5ec778" (UID: "b57b1973-d9a3-4695-b38f-a15bdf5ec778"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.356696 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b57b1973-d9a3-4695-b38f-a15bdf5ec778" (UID: "b57b1973-d9a3-4695-b38f-a15bdf5ec778"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.390837 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b57b1973-d9a3-4695-b38f-a15bdf5ec778" (UID: "b57b1973-d9a3-4695-b38f-a15bdf5ec778"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.391468 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.421866 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxzgf\" (UniqueName: \"kubernetes.io/projected/b57b1973-d9a3-4695-b38f-a15bdf5ec778-kube-api-access-gxzgf\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.421904 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.421914 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.421923 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b57b1973-d9a3-4695-b38f-a15bdf5ec778-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.421933 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57b1973-d9a3-4695-b38f-a15bdf5ec778-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.523655 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45ba94ea-09bc-4752-b93f-edf9ede8b871-config-data\") pod \"45ba94ea-09bc-4752-b93f-edf9ede8b871\" (UID: \"45ba94ea-09bc-4752-b93f-edf9ede8b871\") " Mar 10 16:11:46 crc 
kubenswrapper[4749]: I0310 16:11:46.523734 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm7xv\" (UniqueName: \"kubernetes.io/projected/45ba94ea-09bc-4752-b93f-edf9ede8b871-kube-api-access-qm7xv\") pod \"45ba94ea-09bc-4752-b93f-edf9ede8b871\" (UID: \"45ba94ea-09bc-4752-b93f-edf9ede8b871\") " Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.523766 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ba94ea-09bc-4752-b93f-edf9ede8b871-combined-ca-bundle\") pod \"45ba94ea-09bc-4752-b93f-edf9ede8b871\" (UID: \"45ba94ea-09bc-4752-b93f-edf9ede8b871\") " Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.527501 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ba94ea-09bc-4752-b93f-edf9ede8b871-kube-api-access-qm7xv" (OuterVolumeSpecName: "kube-api-access-qm7xv") pod "45ba94ea-09bc-4752-b93f-edf9ede8b871" (UID: "45ba94ea-09bc-4752-b93f-edf9ede8b871"). InnerVolumeSpecName "kube-api-access-qm7xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.548849 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ba94ea-09bc-4752-b93f-edf9ede8b871-config-data" (OuterVolumeSpecName: "config-data") pod "45ba94ea-09bc-4752-b93f-edf9ede8b871" (UID: "45ba94ea-09bc-4752-b93f-edf9ede8b871"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.550390 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ba94ea-09bc-4752-b93f-edf9ede8b871-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45ba94ea-09bc-4752-b93f-edf9ede8b871" (UID: "45ba94ea-09bc-4752-b93f-edf9ede8b871"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.625810 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45ba94ea-09bc-4752-b93f-edf9ede8b871-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.625841 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm7xv\" (UniqueName: \"kubernetes.io/projected/45ba94ea-09bc-4752-b93f-edf9ede8b871-kube-api-access-qm7xv\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.625851 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ba94ea-09bc-4752-b93f-edf9ede8b871-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.653714 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"45ba94ea-09bc-4752-b93f-edf9ede8b871","Type":"ContainerDied","Data":"a24acf31768b0d11ee0a66f2ba6fc04e5be01cce73484aad86cb144477e538dd"} Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.653743 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.653766 4749 scope.go:117] "RemoveContainer" containerID="fbd038681ab1812c238db796f0ec51f2c716ef880f7cdfee0cb69680be7a5c66" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.657702 4749 generic.go:334] "Generic (PLEG): container finished" podID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" containerID="31bfbc9370df77520454e57e145a24575e3cb20e646035c0442c2954c9a82d00" exitCode=0 Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.657732 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.657747 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b57b1973-d9a3-4695-b38f-a15bdf5ec778","Type":"ContainerDied","Data":"31bfbc9370df77520454e57e145a24575e3cb20e646035c0442c2954c9a82d00"} Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.657774 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b57b1973-d9a3-4695-b38f-a15bdf5ec778","Type":"ContainerDied","Data":"9f74f2d0070202c1205e8fa5a2911515a283a292010a675ba09195efb8f43bb7"} Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.702749 4749 scope.go:117] "RemoveContainer" containerID="31bfbc9370df77520454e57e145a24575e3cb20e646035c0442c2954c9a82d00" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.708922 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.719952 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.729068 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:11:46 crc kubenswrapper[4749]: E0310 16:11:46.730156 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" containerName="nova-metadata-log" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.730182 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" containerName="nova-metadata-log" Mar 10 16:11:46 crc kubenswrapper[4749]: E0310 16:11:46.730214 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" containerName="nova-metadata-metadata" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.730224 4749 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" containerName="nova-metadata-metadata" Mar 10 16:11:46 crc kubenswrapper[4749]: E0310 16:11:46.730267 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ba94ea-09bc-4752-b93f-edf9ede8b871" containerName="nova-scheduler-scheduler" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.730275 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ba94ea-09bc-4752-b93f-edf9ede8b871" containerName="nova-scheduler-scheduler" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.731308 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="45ba94ea-09bc-4752-b93f-edf9ede8b871" containerName="nova-scheduler-scheduler" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.731423 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" containerName="nova-metadata-log" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.731463 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" containerName="nova-metadata-metadata" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.738243 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.751291 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.764666 4749 scope.go:117] "RemoveContainer" containerID="c8bc78a6602473bafbbcb6def7b8eb27054b7820dd75e58a86c507d9d9fd7900" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.765484 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.765607 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.777648 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.796509 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.801408 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.803315 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.803338 4749 scope.go:117] "RemoveContainer" containerID="31bfbc9370df77520454e57e145a24575e3cb20e646035c0442c2954c9a82d00" Mar 10 16:11:46 crc kubenswrapper[4749]: E0310 16:11:46.803821 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31bfbc9370df77520454e57e145a24575e3cb20e646035c0442c2954c9a82d00\": container with ID starting with 31bfbc9370df77520454e57e145a24575e3cb20e646035c0442c2954c9a82d00 not found: ID does not exist" containerID="31bfbc9370df77520454e57e145a24575e3cb20e646035c0442c2954c9a82d00" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.803856 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31bfbc9370df77520454e57e145a24575e3cb20e646035c0442c2954c9a82d00"} err="failed to get container status \"31bfbc9370df77520454e57e145a24575e3cb20e646035c0442c2954c9a82d00\": rpc error: code = NotFound desc = could not find container \"31bfbc9370df77520454e57e145a24575e3cb20e646035c0442c2954c9a82d00\": container with ID starting with 31bfbc9370df77520454e57e145a24575e3cb20e646035c0442c2954c9a82d00 not found: ID does not exist" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.803877 4749 scope.go:117] "RemoveContainer" containerID="c8bc78a6602473bafbbcb6def7b8eb27054b7820dd75e58a86c507d9d9fd7900" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.803899 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 16:11:46 crc kubenswrapper[4749]: E0310 16:11:46.804215 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"c8bc78a6602473bafbbcb6def7b8eb27054b7820dd75e58a86c507d9d9fd7900\": container with ID starting with c8bc78a6602473bafbbcb6def7b8eb27054b7820dd75e58a86c507d9d9fd7900 not found: ID does not exist" containerID="c8bc78a6602473bafbbcb6def7b8eb27054b7820dd75e58a86c507d9d9fd7900" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.804252 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8bc78a6602473bafbbcb6def7b8eb27054b7820dd75e58a86c507d9d9fd7900"} err="failed to get container status \"c8bc78a6602473bafbbcb6def7b8eb27054b7820dd75e58a86c507d9d9fd7900\": rpc error: code = NotFound desc = could not find container \"c8bc78a6602473bafbbcb6def7b8eb27054b7820dd75e58a86c507d9d9fd7900\": container with ID starting with c8bc78a6602473bafbbcb6def7b8eb27054b7820dd75e58a86c507d9d9fd7900 not found: ID does not exist" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.806408 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.830050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e39f11-450f-43a3-ba72-7c3e8245e382-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46e39f11-450f-43a3-ba72-7c3e8245e382\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.830253 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e39f11-450f-43a3-ba72-7c3e8245e382-config-data\") pod \"nova-scheduler-0\" (UID: \"46e39f11-450f-43a3-ba72-7c3e8245e382\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.830352 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt5w9\" (UniqueName: 
\"kubernetes.io/projected/46e39f11-450f-43a3-ba72-7c3e8245e382-kube-api-access-bt5w9\") pod \"nova-scheduler-0\" (UID: \"46e39f11-450f-43a3-ba72-7c3e8245e382\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.932615 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.932682 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt5w9\" (UniqueName: \"kubernetes.io/projected/46e39f11-450f-43a3-ba72-7c3e8245e382-kube-api-access-bt5w9\") pod \"nova-scheduler-0\" (UID: \"46e39f11-450f-43a3-ba72-7c3e8245e382\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.932719 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.932740 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e39f11-450f-43a3-ba72-7c3e8245e382-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46e39f11-450f-43a3-ba72-7c3e8245e382\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.932839 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-config-data\") pod 
\"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.932879 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nzm2\" (UniqueName: \"kubernetes.io/projected/d61221be-c05f-47ae-a3b5-80f59d809281-kube-api-access-7nzm2\") pod \"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.932920 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d61221be-c05f-47ae-a3b5-80f59d809281-logs\") pod \"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.932947 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e39f11-450f-43a3-ba72-7c3e8245e382-config-data\") pod \"nova-scheduler-0\" (UID: \"46e39f11-450f-43a3-ba72-7c3e8245e382\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.936516 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e39f11-450f-43a3-ba72-7c3e8245e382-config-data\") pod \"nova-scheduler-0\" (UID: \"46e39f11-450f-43a3-ba72-7c3e8245e382\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.936837 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e39f11-450f-43a3-ba72-7c3e8245e382-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46e39f11-450f-43a3-ba72-7c3e8245e382\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:46 crc kubenswrapper[4749]: I0310 16:11:46.949290 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt5w9\" (UniqueName: \"kubernetes.io/projected/46e39f11-450f-43a3-ba72-7c3e8245e382-kube-api-access-bt5w9\") pod \"nova-scheduler-0\" (UID: \"46e39f11-450f-43a3-ba72-7c3e8245e382\") " pod="openstack/nova-scheduler-0" Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.034593 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.034735 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.035524 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-config-data\") pod \"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.035581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nzm2\" (UniqueName: \"kubernetes.io/projected/d61221be-c05f-47ae-a3b5-80f59d809281-kube-api-access-7nzm2\") pod \"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.035669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d61221be-c05f-47ae-a3b5-80f59d809281-logs\") pod \"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.036031 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d61221be-c05f-47ae-a3b5-80f59d809281-logs\") pod \"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.039015 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.039387 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-config-data\") pod \"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.051831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.055158 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nzm2\" (UniqueName: \"kubernetes.io/projected/d61221be-c05f-47ae-a3b5-80f59d809281-kube-api-access-7nzm2\") pod \"nova-metadata-0\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " pod="openstack/nova-metadata-0" Mar 10 16:11:47 crc kubenswrapper[4749]: 
I0310 16:11:47.091370 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.126762 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.571031 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.619078 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45ba94ea-09bc-4752-b93f-edf9ede8b871" path="/var/lib/kubelet/pods/45ba94ea-09bc-4752-b93f-edf9ede8b871/volumes" Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.619930 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57b1973-d9a3-4695-b38f-a15bdf5ec778" path="/var/lib/kubelet/pods/b57b1973-d9a3-4695-b38f-a15bdf5ec778/volumes" Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.665486 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:11:47 crc kubenswrapper[4749]: I0310 16:11:47.689333 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46e39f11-450f-43a3-ba72-7c3e8245e382","Type":"ContainerStarted","Data":"1c2ee7b9d83ccd87d32de631e78711c1cca60bf2b3e1c17b69c66a5a0a5001ec"} Mar 10 16:11:48 crc kubenswrapper[4749]: I0310 16:11:48.704852 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46e39f11-450f-43a3-ba72-7c3e8245e382","Type":"ContainerStarted","Data":"c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7"} Mar 10 16:11:48 crc kubenswrapper[4749]: I0310 16:11:48.707356 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d61221be-c05f-47ae-a3b5-80f59d809281","Type":"ContainerStarted","Data":"223e73b546b611a128d4581fd3fab7f4ad5f58cffc7f3d629e05eb77a8f22f97"} Mar 10 16:11:48 crc kubenswrapper[4749]: I0310 16:11:48.707426 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d61221be-c05f-47ae-a3b5-80f59d809281","Type":"ContainerStarted","Data":"cd77f536ef94e68fc550ef2465958581ff6e48ae27050c48692c33b29d740bde"} Mar 10 16:11:48 crc kubenswrapper[4749]: I0310 16:11:48.707439 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d61221be-c05f-47ae-a3b5-80f59d809281","Type":"ContainerStarted","Data":"2ceea8f7d75fd1e2ab9b5b46bb69175d96646dea1f775db99e0e3db2a997c599"} Mar 10 16:11:48 crc kubenswrapper[4749]: I0310 16:11:48.728356 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.72832519 podStartE2EDuration="2.72832519s" podCreationTimestamp="2026-03-10 16:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:11:48.727520987 +0000 UTC m=+1405.849386664" watchObservedRunningTime="2026-03-10 16:11:48.72832519 +0000 UTC m=+1405.850190917" Mar 10 16:11:48 crc kubenswrapper[4749]: I0310 16:11:48.766365 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.766345959 podStartE2EDuration="2.766345959s" podCreationTimestamp="2026-03-10 16:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:11:48.753095503 +0000 UTC m=+1405.874961190" watchObservedRunningTime="2026-03-10 16:11:48.766345959 +0000 UTC m=+1405.888211646" Mar 10 16:11:52 crc kubenswrapper[4749]: I0310 16:11:52.025139 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-api-0" Mar 10 16:11:52 crc kubenswrapper[4749]: I0310 16:11:52.025506 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 16:11:52 crc kubenswrapper[4749]: I0310 16:11:52.092335 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 16:11:52 crc kubenswrapper[4749]: I0310 16:11:52.127227 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 16:11:52 crc kubenswrapper[4749]: I0310 16:11:52.127283 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 16:11:53 crc kubenswrapper[4749]: I0310 16:11:53.042599 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 16:11:53 crc kubenswrapper[4749]: I0310 16:11:53.042887 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 16:11:57 crc kubenswrapper[4749]: I0310 16:11:57.092683 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 16:11:57 crc kubenswrapper[4749]: I0310 16:11:57.127625 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 16:11:57 crc kubenswrapper[4749]: I0310 16:11:57.127741 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 16:11:57 crc kubenswrapper[4749]: I0310 
16:11:57.132557 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 16:11:57 crc kubenswrapper[4749]: I0310 16:11:57.843943 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 16:11:58 crc kubenswrapper[4749]: I0310 16:11:58.063922 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 16:11:58 crc kubenswrapper[4749]: I0310 16:11:58.149105 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d61221be-c05f-47ae-a3b5-80f59d809281" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 16:11:58 crc kubenswrapper[4749]: I0310 16:11:58.149317 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d61221be-c05f-47ae-a3b5-80f59d809281" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 16:12:00 crc kubenswrapper[4749]: I0310 16:12:00.165079 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552652-x8p4n"] Mar 10 16:12:00 crc kubenswrapper[4749]: I0310 16:12:00.167358 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552652-x8p4n" Mar 10 16:12:00 crc kubenswrapper[4749]: I0310 16:12:00.170058 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:12:00 crc kubenswrapper[4749]: I0310 16:12:00.170706 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:12:00 crc kubenswrapper[4749]: I0310 16:12:00.170803 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:12:00 crc kubenswrapper[4749]: I0310 16:12:00.175279 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552652-x8p4n"] Mar 10 16:12:00 crc kubenswrapper[4749]: I0310 16:12:00.301197 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92wmn\" (UniqueName: \"kubernetes.io/projected/e8077178-4215-4f40-8aff-2dd8f4766821-kube-api-access-92wmn\") pod \"auto-csr-approver-29552652-x8p4n\" (UID: \"e8077178-4215-4f40-8aff-2dd8f4766821\") " pod="openshift-infra/auto-csr-approver-29552652-x8p4n" Mar 10 16:12:00 crc kubenswrapper[4749]: I0310 16:12:00.403246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92wmn\" (UniqueName: \"kubernetes.io/projected/e8077178-4215-4f40-8aff-2dd8f4766821-kube-api-access-92wmn\") pod \"auto-csr-approver-29552652-x8p4n\" (UID: \"e8077178-4215-4f40-8aff-2dd8f4766821\") " pod="openshift-infra/auto-csr-approver-29552652-x8p4n" Mar 10 16:12:00 crc kubenswrapper[4749]: I0310 16:12:00.427920 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92wmn\" (UniqueName: \"kubernetes.io/projected/e8077178-4215-4f40-8aff-2dd8f4766821-kube-api-access-92wmn\") pod \"auto-csr-approver-29552652-x8p4n\" (UID: \"e8077178-4215-4f40-8aff-2dd8f4766821\") " 
pod="openshift-infra/auto-csr-approver-29552652-x8p4n" Mar 10 16:12:00 crc kubenswrapper[4749]: I0310 16:12:00.497439 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552652-x8p4n" Mar 10 16:12:00 crc kubenswrapper[4749]: I0310 16:12:00.960493 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552652-x8p4n"] Mar 10 16:12:01 crc kubenswrapper[4749]: I0310 16:12:01.838346 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552652-x8p4n" event={"ID":"e8077178-4215-4f40-8aff-2dd8f4766821","Type":"ContainerStarted","Data":"4058d884aad83c4a027c0be51cb27420e9c2f29be7280e9c8b723c2ceff481d7"} Mar 10 16:12:02 crc kubenswrapper[4749]: I0310 16:12:02.030405 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 16:12:02 crc kubenswrapper[4749]: I0310 16:12:02.030750 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 16:12:02 crc kubenswrapper[4749]: I0310 16:12:02.031986 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 16:12:02 crc kubenswrapper[4749]: I0310 16:12:02.037825 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 16:12:02 crc kubenswrapper[4749]: I0310 16:12:02.861099 4749 generic.go:334] "Generic (PLEG): container finished" podID="e8077178-4215-4f40-8aff-2dd8f4766821" containerID="696da172e02fe62a97048ef4e054fc5dd19a2680cfd820003727745d5cc14db9" exitCode=0 Mar 10 16:12:02 crc kubenswrapper[4749]: I0310 16:12:02.861270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552652-x8p4n" event={"ID":"e8077178-4215-4f40-8aff-2dd8f4766821","Type":"ContainerDied","Data":"696da172e02fe62a97048ef4e054fc5dd19a2680cfd820003727745d5cc14db9"} Mar 10 16:12:02 crc 
kubenswrapper[4749]: I0310 16:12:02.864570 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 16:12:02 crc kubenswrapper[4749]: I0310 16:12:02.879015 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 16:12:04 crc kubenswrapper[4749]: I0310 16:12:04.230610 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552652-x8p4n" Mar 10 16:12:04 crc kubenswrapper[4749]: I0310 16:12:04.380834 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92wmn\" (UniqueName: \"kubernetes.io/projected/e8077178-4215-4f40-8aff-2dd8f4766821-kube-api-access-92wmn\") pod \"e8077178-4215-4f40-8aff-2dd8f4766821\" (UID: \"e8077178-4215-4f40-8aff-2dd8f4766821\") " Mar 10 16:12:04 crc kubenswrapper[4749]: I0310 16:12:04.387562 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8077178-4215-4f40-8aff-2dd8f4766821-kube-api-access-92wmn" (OuterVolumeSpecName: "kube-api-access-92wmn") pod "e8077178-4215-4f40-8aff-2dd8f4766821" (UID: "e8077178-4215-4f40-8aff-2dd8f4766821"). InnerVolumeSpecName "kube-api-access-92wmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:04 crc kubenswrapper[4749]: I0310 16:12:04.483632 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92wmn\" (UniqueName: \"kubernetes.io/projected/e8077178-4215-4f40-8aff-2dd8f4766821-kube-api-access-92wmn\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:04 crc kubenswrapper[4749]: I0310 16:12:04.887907 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552652-x8p4n" event={"ID":"e8077178-4215-4f40-8aff-2dd8f4766821","Type":"ContainerDied","Data":"4058d884aad83c4a027c0be51cb27420e9c2f29be7280e9c8b723c2ceff481d7"} Mar 10 16:12:04 crc kubenswrapper[4749]: I0310 16:12:04.887970 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4058d884aad83c4a027c0be51cb27420e9c2f29be7280e9c8b723c2ceff481d7" Mar 10 16:12:04 crc kubenswrapper[4749]: I0310 16:12:04.887933 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552652-x8p4n" Mar 10 16:12:05 crc kubenswrapper[4749]: I0310 16:12:05.315707 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552646-fcx9s"] Mar 10 16:12:05 crc kubenswrapper[4749]: I0310 16:12:05.323268 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552646-fcx9s"] Mar 10 16:12:05 crc kubenswrapper[4749]: I0310 16:12:05.621791 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d12a68-0f40-4da5-8662-2228ed4812e4" path="/var/lib/kubelet/pods/31d12a68-0f40-4da5-8662-2228ed4812e4/volumes" Mar 10 16:12:07 crc kubenswrapper[4749]: I0310 16:12:07.131760 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 16:12:07 crc kubenswrapper[4749]: I0310 16:12:07.132977 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-metadata-0" Mar 10 16:12:07 crc kubenswrapper[4749]: I0310 16:12:07.136200 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 16:12:07 crc kubenswrapper[4749]: I0310 16:12:07.928344 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 16:12:20 crc kubenswrapper[4749]: I0310 16:12:20.948603 4749 scope.go:117] "RemoveContainer" containerID="1ad07119a18cdab9ff4030b9d94602e3bbe4d2e3ab6cd5604847ef802f7dbd2c" Mar 10 16:12:24 crc kubenswrapper[4749]: I0310 16:12:24.921429 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 10 16:12:24 crc kubenswrapper[4749]: I0310 16:12:24.922302 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="bdf02d6a-5794-4b1d-b155-f683bdb8680d" containerName="openstackclient" containerID="cri-o://217e914770081e2bccbbae1cf847983e071573c618440c545a24e7fc9b20a92f" gracePeriod=2 Mar 10 16:12:24 crc kubenswrapper[4749]: I0310 16:12:24.942809 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.002445 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8701-account-create-update-lp5d6"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.023625 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8701-account-create-update-lp5d6"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.056277 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8701-account-create-update-nnbrt"] Mar 10 16:12:25 crc kubenswrapper[4749]: E0310 16:12:25.056747 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf02d6a-5794-4b1d-b155-f683bdb8680d" containerName="openstackclient" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.056759 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf02d6a-5794-4b1d-b155-f683bdb8680d" containerName="openstackclient" Mar 10 16:12:25 crc kubenswrapper[4749]: E0310 16:12:25.056783 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8077178-4215-4f40-8aff-2dd8f4766821" containerName="oc" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.056789 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8077178-4215-4f40-8aff-2dd8f4766821" containerName="oc" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.056960 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdf02d6a-5794-4b1d-b155-f683bdb8680d" containerName="openstackclient" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.056987 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8077178-4215-4f40-8aff-2dd8f4766821" containerName="oc" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.057636 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8701-account-create-update-nnbrt" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.066453 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.083040 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vvxvc"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.084214 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vvxvc" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.106815 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.108812 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8701-account-create-update-nnbrt"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.119567 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cr5kc"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.132966 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vvxvc"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.144661 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-cr5kc"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.163178 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-116d-account-create-update-66pnv"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.177785 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.178014 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="e794ff07-5e05-4d6c-8cc6-64efd90fd91b" containerName="ovn-northd" containerID="cri-o://ca2716f51a54c46af5450d10ce7392124f1eeace51046ae8bee36f5a018d9fcf" gracePeriod=30 Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.178434 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="e794ff07-5e05-4d6c-8cc6-64efd90fd91b" containerName="openstack-network-exporter" containerID="cri-o://38f4b6694812060a48587ca6ffe71a2cddf74223e2fc0f5cf662287449dd9c81" gracePeriod=30 Mar 10 16:12:25 crc kubenswrapper[4749]: 
I0310 16:12:25.180723 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcjc9\" (UniqueName: \"kubernetes.io/projected/1ac12213-5bcb-465c-a6aa-fa9e8e97c290-kube-api-access-kcjc9\") pod \"root-account-create-update-vvxvc\" (UID: \"1ac12213-5bcb-465c-a6aa-fa9e8e97c290\") " pod="openstack/root-account-create-update-vvxvc" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.180847 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ac12213-5bcb-465c-a6aa-fa9e8e97c290-operator-scripts\") pod \"root-account-create-update-vvxvc\" (UID: \"1ac12213-5bcb-465c-a6aa-fa9e8e97c290\") " pod="openstack/root-account-create-update-vvxvc" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.180979 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/783a67d8-9e22-4503-82a6-5f49fb50ee7b-operator-scripts\") pod \"glance-8701-account-create-update-nnbrt\" (UID: \"783a67d8-9e22-4503-82a6-5f49fb50ee7b\") " pod="openstack/glance-8701-account-create-update-nnbrt" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.183684 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srx6t\" (UniqueName: \"kubernetes.io/projected/783a67d8-9e22-4503-82a6-5f49fb50ee7b-kube-api-access-srx6t\") pod \"glance-8701-account-create-update-nnbrt\" (UID: \"783a67d8-9e22-4503-82a6-5f49fb50ee7b\") " pod="openstack/glance-8701-account-create-update-nnbrt" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.194437 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-116d-account-create-update-66pnv"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.216432 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-116d-account-create-update-t69qj"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.217931 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-116d-account-create-update-t69qj" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.222681 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.241357 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-116d-account-create-update-t69qj"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.285059 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z62h5\" (UniqueName: \"kubernetes.io/projected/101ece94-d304-4797-a87d-e7fc8deb6199-kube-api-access-z62h5\") pod \"nova-api-116d-account-create-update-t69qj\" (UID: \"101ece94-d304-4797-a87d-e7fc8deb6199\") " pod="openstack/nova-api-116d-account-create-update-t69qj" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.285351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/101ece94-d304-4797-a87d-e7fc8deb6199-operator-scripts\") pod \"nova-api-116d-account-create-update-t69qj\" (UID: \"101ece94-d304-4797-a87d-e7fc8deb6199\") " pod="openstack/nova-api-116d-account-create-update-t69qj" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.285412 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcjc9\" (UniqueName: \"kubernetes.io/projected/1ac12213-5bcb-465c-a6aa-fa9e8e97c290-kube-api-access-kcjc9\") pod \"root-account-create-update-vvxvc\" (UID: \"1ac12213-5bcb-465c-a6aa-fa9e8e97c290\") " pod="openstack/root-account-create-update-vvxvc" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.285434 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ac12213-5bcb-465c-a6aa-fa9e8e97c290-operator-scripts\") pod \"root-account-create-update-vvxvc\" (UID: \"1ac12213-5bcb-465c-a6aa-fa9e8e97c290\") " pod="openstack/root-account-create-update-vvxvc" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.285498 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/783a67d8-9e22-4503-82a6-5f49fb50ee7b-operator-scripts\") pod \"glance-8701-account-create-update-nnbrt\" (UID: \"783a67d8-9e22-4503-82a6-5f49fb50ee7b\") " pod="openstack/glance-8701-account-create-update-nnbrt" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.285531 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srx6t\" (UniqueName: \"kubernetes.io/projected/783a67d8-9e22-4503-82a6-5f49fb50ee7b-kube-api-access-srx6t\") pod \"glance-8701-account-create-update-nnbrt\" (UID: \"783a67d8-9e22-4503-82a6-5f49fb50ee7b\") " pod="openstack/glance-8701-account-create-update-nnbrt" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.286987 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ac12213-5bcb-465c-a6aa-fa9e8e97c290-operator-scripts\") pod \"root-account-create-update-vvxvc\" (UID: \"1ac12213-5bcb-465c-a6aa-fa9e8e97c290\") " pod="openstack/root-account-create-update-vvxvc" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.287148 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/783a67d8-9e22-4503-82a6-5f49fb50ee7b-operator-scripts\") pod \"glance-8701-account-create-update-nnbrt\" (UID: \"783a67d8-9e22-4503-82a6-5f49fb50ee7b\") " pod="openstack/glance-8701-account-create-update-nnbrt" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 
16:12:25.289444 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-939f-account-create-update-9kdfk"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.348465 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srx6t\" (UniqueName: \"kubernetes.io/projected/783a67d8-9e22-4503-82a6-5f49fb50ee7b-kube-api-access-srx6t\") pod \"glance-8701-account-create-update-nnbrt\" (UID: \"783a67d8-9e22-4503-82a6-5f49fb50ee7b\") " pod="openstack/glance-8701-account-create-update-nnbrt" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.360680 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-939f-account-create-update-9kdfk"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.360726 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-939f-account-create-update-l8hzh"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.383463 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-939f-account-create-update-l8hzh"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.383563 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-939f-account-create-update-l8hzh" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.385091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcjc9\" (UniqueName: \"kubernetes.io/projected/1ac12213-5bcb-465c-a6aa-fa9e8e97c290-kube-api-access-kcjc9\") pod \"root-account-create-update-vvxvc\" (UID: \"1ac12213-5bcb-465c-a6aa-fa9e8e97c290\") " pod="openstack/root-account-create-update-vvxvc" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.386794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z62h5\" (UniqueName: \"kubernetes.io/projected/101ece94-d304-4797-a87d-e7fc8deb6199-kube-api-access-z62h5\") pod \"nova-api-116d-account-create-update-t69qj\" (UID: \"101ece94-d304-4797-a87d-e7fc8deb6199\") " pod="openstack/nova-api-116d-account-create-update-t69qj" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.386824 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/101ece94-d304-4797-a87d-e7fc8deb6199-operator-scripts\") pod \"nova-api-116d-account-create-update-t69qj\" (UID: \"101ece94-d304-4797-a87d-e7fc8deb6199\") " pod="openstack/nova-api-116d-account-create-update-t69qj" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.387504 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/101ece94-d304-4797-a87d-e7fc8deb6199-operator-scripts\") pod \"nova-api-116d-account-create-update-t69qj\" (UID: \"101ece94-d304-4797-a87d-e7fc8deb6199\") " pod="openstack/nova-api-116d-account-create-update-t69qj" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.399047 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8701-account-create-update-nnbrt" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.417762 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.432403 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vvxvc" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.438479 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-ff4lh"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.439211 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z62h5\" (UniqueName: \"kubernetes.io/projected/101ece94-d304-4797-a87d-e7fc8deb6199-kube-api-access-z62h5\") pod \"nova-api-116d-account-create-update-t69qj\" (UID: \"101ece94-d304-4797-a87d-e7fc8deb6199\") " pod="openstack/nova-api-116d-account-create-update-t69qj" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.488622 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ef97dd-388e-4fc7-82bf-908c61ca2fe2-operator-scripts\") pod \"nova-cell0-939f-account-create-update-l8hzh\" (UID: \"d5ef97dd-388e-4fc7-82bf-908c61ca2fe2\") " pod="openstack/nova-cell0-939f-account-create-update-l8hzh" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.488936 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d59n\" (UniqueName: \"kubernetes.io/projected/d5ef97dd-388e-4fc7-82bf-908c61ca2fe2-kube-api-access-4d59n\") pod \"nova-cell0-939f-account-create-update-l8hzh\" (UID: \"d5ef97dd-388e-4fc7-82bf-908c61ca2fe2\") " pod="openstack/nova-cell0-939f-account-create-update-l8hzh" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.517159 4749 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/cinder-db-sync-ff4lh"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.559881 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-116d-account-create-update-t69qj" Mar 10 16:12:25 crc kubenswrapper[4749]: E0310 16:12:25.638783 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.199:60876->38.102.83.199:40501: write tcp 38.102.83.199:60876->38.102.83.199:40501: write: broken pipe Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.644252 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ef97dd-388e-4fc7-82bf-908c61ca2fe2-operator-scripts\") pod \"nova-cell0-939f-account-create-update-l8hzh\" (UID: \"d5ef97dd-388e-4fc7-82bf-908c61ca2fe2\") " pod="openstack/nova-cell0-939f-account-create-update-l8hzh" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.645732 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ef97dd-388e-4fc7-82bf-908c61ca2fe2-operator-scripts\") pod \"nova-cell0-939f-account-create-update-l8hzh\" (UID: \"d5ef97dd-388e-4fc7-82bf-908c61ca2fe2\") " pod="openstack/nova-cell0-939f-account-create-update-l8hzh" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.668408 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d59n\" (UniqueName: \"kubernetes.io/projected/d5ef97dd-388e-4fc7-82bf-908c61ca2fe2-kube-api-access-4d59n\") pod \"nova-cell0-939f-account-create-update-l8hzh\" (UID: \"d5ef97dd-388e-4fc7-82bf-908c61ca2fe2\") " pod="openstack/nova-cell0-939f-account-create-update-l8hzh" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.724908 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1128619c-2c1a-4589-a013-34444a447036" 
path="/var/lib/kubelet/pods/1128619c-2c1a-4589-a013-34444a447036/volumes" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.725723 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a" path="/var/lib/kubelet/pods/41d91a5e-e0e0-4ba7-a0d5-36e0ea3d0e0a/volumes" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.726458 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="876272e9-3af8-40ba-aac7-40f8cecc909e" path="/var/lib/kubelet/pods/876272e9-3af8-40ba-aac7-40f8cecc909e/volumes" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.727322 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37f7a74-810d-4165-bdb4-eb70e15c4f97" path="/var/lib/kubelet/pods/e37f7a74-810d-4165-bdb4-eb70e15c4f97/volumes" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.761557 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e84f538a-6b0b-44e6-863b-bd06abf880d7" path="/var/lib/kubelet/pods/e84f538a-6b0b-44e6-863b-bd06abf880d7/volumes" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.762222 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.762257 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-88fa-account-create-update-lqrp8"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.760577 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d59n\" (UniqueName: \"kubernetes.io/projected/d5ef97dd-388e-4fc7-82bf-908c61ca2fe2-kube-api-access-4d59n\") pod \"nova-cell0-939f-account-create-update-l8hzh\" (UID: \"d5ef97dd-388e-4fc7-82bf-908c61ca2fe2\") " pod="openstack/nova-cell0-939f-account-create-update-l8hzh" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.768519 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-88fa-account-create-update-lqrp8" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.778927 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.804436 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-88fa-account-create-update-lqrp8"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.860416 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-939f-account-create-update-l8hzh" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.868492 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.869150 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="b3814c41-600a-4463-9695-e55c293ffead" containerName="openstack-network-exporter" containerID="cri-o://36ccb0b67d01f8e0c84de941ec69aa7a4955ba535082448c8b6fccd6ff57bdab" gracePeriod=300 Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.873605 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gb86\" (UniqueName: \"kubernetes.io/projected/615e021a-88a2-496f-81a4-46d70e40310d-kube-api-access-5gb86\") pod \"nova-cell1-88fa-account-create-update-lqrp8\" (UID: \"615e021a-88a2-496f-81a4-46d70e40310d\") " pod="openstack/nova-cell1-88fa-account-create-update-lqrp8" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.873720 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/615e021a-88a2-496f-81a4-46d70e40310d-operator-scripts\") pod \"nova-cell1-88fa-account-create-update-lqrp8\" (UID: \"615e021a-88a2-496f-81a4-46d70e40310d\") " 
pod="openstack/nova-cell1-88fa-account-create-update-lqrp8" Mar 10 16:12:25 crc kubenswrapper[4749]: E0310 16:12:25.879470 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 10 16:12:25 crc kubenswrapper[4749]: E0310 16:12:25.879535 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data podName:1feaa4c9-2cec-45a8-9106-5be885c26eae nodeName:}" failed. No retries permitted until 2026-03-10 16:12:26.379518865 +0000 UTC m=+1443.501384662 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data") pod "rabbitmq-server-0" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae") : configmap "rabbitmq-config-data" not found Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.942644 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-964c-account-create-update-vg7qc"] Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.980529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gb86\" (UniqueName: \"kubernetes.io/projected/615e021a-88a2-496f-81a4-46d70e40310d-kube-api-access-5gb86\") pod \"nova-cell1-88fa-account-create-update-lqrp8\" (UID: \"615e021a-88a2-496f-81a4-46d70e40310d\") " pod="openstack/nova-cell1-88fa-account-create-update-lqrp8" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.980834 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/615e021a-88a2-496f-81a4-46d70e40310d-operator-scripts\") pod \"nova-cell1-88fa-account-create-update-lqrp8\" (UID: \"615e021a-88a2-496f-81a4-46d70e40310d\") " pod="openstack/nova-cell1-88fa-account-create-update-lqrp8" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.981459 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/615e021a-88a2-496f-81a4-46d70e40310d-operator-scripts\") pod \"nova-cell1-88fa-account-create-update-lqrp8\" (UID: \"615e021a-88a2-496f-81a4-46d70e40310d\") " pod="openstack/nova-cell1-88fa-account-create-update-lqrp8" Mar 10 16:12:25 crc kubenswrapper[4749]: I0310 16:12:25.992207 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-964c-account-create-update-vg7qc"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.057081 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-tbtx4"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.084716 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-tbtx4"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.102923 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gb86\" (UniqueName: \"kubernetes.io/projected/615e021a-88a2-496f-81a4-46d70e40310d-kube-api-access-5gb86\") pod \"nova-cell1-88fa-account-create-update-lqrp8\" (UID: \"615e021a-88a2-496f-81a4-46d70e40310d\") " pod="openstack/nova-cell1-88fa-account-create-update-lqrp8" Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.127979 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-fptmr"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.136932 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-88fa-account-create-update-lqrp8" Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.172581 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-88fa-account-create-update-sh94m"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.204339 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-fptmr"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.253689 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-88fa-account-create-update-sh94m"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.261118 4749 generic.go:334] "Generic (PLEG): container finished" podID="e794ff07-5e05-4d6c-8cc6-64efd90fd91b" containerID="38f4b6694812060a48587ca6ffe71a2cddf74223e2fc0f5cf662287449dd9c81" exitCode=2 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.261164 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e794ff07-5e05-4d6c-8cc6-64efd90fd91b","Type":"ContainerDied","Data":"38f4b6694812060a48587ca6ffe71a2cddf74223e2fc0f5cf662287449dd9c81"} Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.292423 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c29f-account-create-update-5jmdq"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.324885 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c29f-account-create-update-5jmdq"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.372431 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-wptw6"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.372691 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-wptw6" podUID="8a0229a2-b07d-4baa-8b4c-a1c356e38679" containerName="openstack-network-exporter" 
containerID="cri-o://45eb6214436b2b1d33093e3aaa79629869ba461dde4208aab226427673052ba4" gracePeriod=30 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.385521 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.386166 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="99aedb1b-bca3-41ef-9399-4678f86ac87c" containerName="openstack-network-exporter" containerID="cri-o://ab8c1ce0bf3c8cbe9d4fea8597af8d5906a17d426930351c4a8cef5fb3330560" gracePeriod=300 Mar 10 16:12:26 crc kubenswrapper[4749]: E0310 16:12:26.392164 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 10 16:12:26 crc kubenswrapper[4749]: E0310 16:12:26.392231 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data podName:1feaa4c9-2cec-45a8-9106-5be885c26eae nodeName:}" failed. No retries permitted until 2026-03-10 16:12:27.392216265 +0000 UTC m=+1444.514081952 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data") pod "rabbitmq-server-0" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae") : configmap "rabbitmq-config-data" not found Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.414598 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-bd2hf"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.420089 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vms4g"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.437432 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57db588689-ff8h6"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.437691 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57db588689-ff8h6" podUID="dafd71a4-7276-4bce-84d9-6568e9d38d9d" containerName="dnsmasq-dns" containerID="cri-o://c5c3ca0f09ffdc3b0ca768bffd9853e585b15e8e8d35502eee9fe5cdf2e81621" gracePeriod=10 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.453264 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-q4tb7"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.471642 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-q4tb7"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.480441 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-gwdl7"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.489708 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-gwdl7"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.509640 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-bk27f"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.523544 4749 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-cell1-cell-mapping-cd6gw"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.558473 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-cd6gw"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.571739 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-bk27f"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.599822 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="b3814c41-600a-4463-9695-e55c293ffead" containerName="ovsdbserver-nb" containerID="cri-o://5371680630eb3603a303a6b8c490e07844917998f70f368075231557f3230b3f" gracePeriod=300 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.622892 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.632275 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.632493 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a0d845ea-a98a-43ae-9803-30e5d306d29d" containerName="cinder-api-log" containerID="cri-o://0b98dca184d3190b9e30a3a5bcef1c368b98ad5b0683d689cbed67e3d6d6fa5c" gracePeriod=30 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.632827 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a0d845ea-a98a-43ae-9803-30e5d306d29d" containerName="cinder-api" containerID="cri-o://f8d45398159e5ac21928a6fe846afd80a890bf4e12f47c045bfc44f153806dda" gracePeriod=30 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.650760 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.650991 4749 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/cinder-scheduler-0" podUID="01351004-ea7d-4973-9dd2-859022a35edb" containerName="cinder-scheduler" containerID="cri-o://979fafe5fb14a8e96ba3c95974f251f5e9ed6197a8ab83b091aa994aadb744a2" gracePeriod=30 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.651102 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="01351004-ea7d-4973-9dd2-859022a35edb" containerName="probe" containerID="cri-o://bb17514493f3006a9700ec5156a08c7d51cbec38b230b0424cddada1c317646a" gracePeriod=30 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.668318 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8dfcffcf6-962bk"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.668624 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-8dfcffcf6-962bk" podUID="7cc64163-530a-4b31-9acc-84910336b781" containerName="placement-log" containerID="cri-o://e35a1f1f5541a5e016192c72f9089e80dd9f3fd2c9d8da246bcb6d412f4bd4c2" gracePeriod=30 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.668741 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-8dfcffcf6-962bk" podUID="7cc64163-530a-4b31-9acc-84910336b781" containerName="placement-api" containerID="cri-o://57585bb9b412ce5a752b8acb87716ffcfdbab4a41883a915954d36af9a0479b4" gracePeriod=30 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.722514 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-kxbww"] Mar 10 16:12:26 crc kubenswrapper[4749]: E0310 16:12:26.727894 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 10 16:12:26 crc kubenswrapper[4749]: E0310 16:12:26.727948 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-config-data podName:d34f67ec-ba88-43c9-84af-2c59a2dbbbe3 nodeName:}" failed. No retries permitted until 2026-03-10 16:12:27.227933128 +0000 UTC m=+1444.349798815 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-config-data") pod "rabbitmq-cell1-server-0" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3") : configmap "rabbitmq-cell1-config-data" not found Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.743647 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="99aedb1b-bca3-41ef-9399-4678f86ac87c" containerName="ovsdbserver-sb" containerID="cri-o://04134bf7a94714ff5e0058f41e64303d4889270f61ee60ea65b0f477b594285b" gracePeriod=300 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.755999 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-kxbww"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.875281 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.883041 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1b598099-b3f7-4157-8e5f-6eb472806511" containerName="glance-log" containerID="cri-o://cab2c2597fe3eedc75127b4143e5f1b6bbdc90ba2c1b1f74f9e373f1b0ed0f17" gracePeriod=30 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.883209 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1b598099-b3f7-4157-8e5f-6eb472806511" containerName="glance-httpd" containerID="cri-o://5d60d1aa5d24cbc57ff5075376de396d004cbb8a0f8a549e929e2a81a8d75bd4" gracePeriod=30 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.897703 
4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-l9q7r"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.930507 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-l9q7r"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.944069 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9d56-account-create-update-txqfx"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.951879 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9d56-account-create-update-txqfx"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.963527 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.963853 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="15480433-b4c2-47c5-a7e4-73395b5bd27d" containerName="glance-log" containerID="cri-o://c88c00c58bcfe5242271bb002d37c1a1a9cd3e5dd3b4b9465326ba4e737970b2" gracePeriod=30 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.964482 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="15480433-b4c2-47c5-a7e4-73395b5bd27d" containerName="glance-httpd" containerID="cri-o://236964c999aec36cddb5fe0239f2b923e3a235f3f2a6498c0a9402202498207b" gracePeriod=30 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.975963 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-658447d949-bwfgt"] Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.976277 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-658447d949-bwfgt" podUID="236aa9f6-5238-45de-813d-e0b18c887f64" containerName="neutron-api" containerID="cri-o://1ef238a15dbaf6a5a3c10ac3efb6186c6bcbe89603f578e61204e28180d61d73" 
gracePeriod=30 Mar 10 16:12:26 crc kubenswrapper[4749]: I0310 16:12:26.976438 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-658447d949-bwfgt" podUID="236aa9f6-5238-45de-813d-e0b18c887f64" containerName="neutron-httpd" containerID="cri-o://44cd16ebefec8b032bd832bb6a1686dd3509d76e1b509d8e93cab3ba9ee33de6" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.001205 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.001684 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-server" containerID="cri-o://29ac57ed16691afc977b4681e0422ae230ea59a572f4650e3bdddd776aa79bb5" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.001693 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="swift-recon-cron" containerID="cri-o://bb3018fbeb8ae7b1c647090c0018a4a47752b2fecaeeb8c6590810e66f9aa576" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.001793 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="rsync" containerID="cri-o://250241ced61fd2f305f7eac66ea651ce382c5d2b773ef64a2685a3c7d8d51177" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.001825 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-expirer" containerID="cri-o://3166a1f44ecaa58fb6e63606a58aaeab5a00e8125684017b580c9d113d9e28b5" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.001861 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-auditor" containerID="cri-o://f5efe081840048441e1beed7605a3cb1367701bf79c16745edeb89014e57cfd3" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.001872 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-replicator" containerID="cri-o://053fe9a47af41f4524a0e07b8bbd05a0e52c4cfbfed225b9dbae228562ae848d" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.001906 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-server" containerID="cri-o://65364d61dd576038476454d20bf73aad662c4d66d9c6d66420bac6b5eaf2e5a9" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.001913 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-updater" containerID="cri-o://939fd72e5f26c025b8bd5f50417db1cf59a2598dfacc057fb8c21e97844cbbfb" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.001934 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-reaper" containerID="cri-o://2efb77202f22e2883bdd91f0e2cbe30b348e7e743f36692bb368eee05f179b1c" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.001946 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-auditor" 
containerID="cri-o://74b8af8f2db1800275caf1f0c1dd54c407bf2a89888af0cb0f77a6137eefaa29" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.001964 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-auditor" containerID="cri-o://419a6797b9dc5a25731dd54073656c1ca9fe4985a5320bbf58bad0ef2faab065" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.001981 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-replicator" containerID="cri-o://17f777430cc879db7bf9f1c9d488b86862db908897af54350d46a2360cb2549a" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.001992 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-replicator" containerID="cri-o://69a1fa408b996e90e4ce6fd72aa7dc3a695dcdd428a8dbfb31ec84cff99d7620" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.002016 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-server" containerID="cri-o://9e2d6b3f1436ccf55b4a7e8f114d51974e8455865ec7c14f9e8e310b24cbd46f" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.002051 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-updater" containerID="cri-o://8e3dc189fd5a5f36d1ed3928478b796a36ebae234d6abfa35187c0cc7daab6fc" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.042505 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-create-b5l2s"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.068413 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-b5l2s"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.080753 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-35cb-account-create-update-8zfvq"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.107444 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-35cb-account-create-update-8zfvq"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.121237 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-vxt5v"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.148085 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-vxt5v"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.153271 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8701-account-create-update-nnbrt"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.187591 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-f5f78"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.215425 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-f5f78"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.243787 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.244024 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" containerName="nova-api-log" containerID="cri-o://15200dc474647db1a4fe0a09c7e300067457404fd74cf484162cb0842079dff1" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.244498 4749 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-api-0" podUID="ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" containerName="nova-api-api" containerID="cri-o://98105fccfb28ffdfad3a0b356c5f8eb37b06bcf57c45dd3c31713662a31ddfe5" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: E0310 16:12:27.258768 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 10 16:12:27 crc kubenswrapper[4749]: E0310 16:12:27.258824 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-config-data podName:d34f67ec-ba88-43c9-84af-2c59a2dbbbe3 nodeName:}" failed. No retries permitted until 2026-03-10 16:12:28.258810681 +0000 UTC m=+1445.380676368 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-config-data") pod "rabbitmq-cell1-server-0" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3") : configmap "rabbitmq-cell1-config-data" not found Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.267522 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-qtzqs"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.302079 4749 generic.go:334] "Generic (PLEG): container finished" podID="a0d845ea-a98a-43ae-9803-30e5d306d29d" containerID="0b98dca184d3190b9e30a3a5bcef1c368b98ad5b0683d689cbed67e3d6d6fa5c" exitCode=143 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.302484 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-qtzqs"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.302543 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0d845ea-a98a-43ae-9803-30e5d306d29d","Type":"ContainerDied","Data":"0b98dca184d3190b9e30a3a5bcef1c368b98ad5b0683d689cbed67e3d6d6fa5c"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.315002 4749 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.332101 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hqhhg"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.345499 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hqhhg"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.348981 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="939fd72e5f26c025b8bd5f50417db1cf59a2598dfacc057fb8c21e97844cbbfb" exitCode=0 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.349032 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="74b8af8f2db1800275caf1f0c1dd54c407bf2a89888af0cb0f77a6137eefaa29" exitCode=0 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.349042 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="17f777430cc879db7bf9f1c9d488b86862db908897af54350d46a2360cb2549a" exitCode=0 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.349050 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="f5efe081840048441e1beed7605a3cb1367701bf79c16745edeb89014e57cfd3" exitCode=0 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.349056 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="053fe9a47af41f4524a0e07b8bbd05a0e52c4cfbfed225b9dbae228562ae848d" exitCode=0 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.349065 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="419a6797b9dc5a25731dd54073656c1ca9fe4985a5320bbf58bad0ef2faab065" exitCode=0 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 
16:12:27.349072 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="69a1fa408b996e90e4ce6fd72aa7dc3a695dcdd428a8dbfb31ec84cff99d7620" exitCode=0 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.349200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"939fd72e5f26c025b8bd5f50417db1cf59a2598dfacc057fb8c21e97844cbbfb"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.349229 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"74b8af8f2db1800275caf1f0c1dd54c407bf2a89888af0cb0f77a6137eefaa29"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.349242 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"17f777430cc879db7bf9f1c9d488b86862db908897af54350d46a2360cb2549a"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.349250 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"f5efe081840048441e1beed7605a3cb1367701bf79c16745edeb89014e57cfd3"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.349259 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"053fe9a47af41f4524a0e07b8bbd05a0e52c4cfbfed225b9dbae228562ae848d"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.349267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"419a6797b9dc5a25731dd54073656c1ca9fe4985a5320bbf58bad0ef2faab065"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.349276 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"69a1fa408b996e90e4ce6fd72aa7dc3a695dcdd428a8dbfb31ec84cff99d7620"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.353666 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-116d-account-create-update-t69qj"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.360954 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.361204 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d61221be-c05f-47ae-a3b5-80f59d809281" containerName="nova-metadata-log" containerID="cri-o://cd77f536ef94e68fc550ef2465958581ff6e48ae27050c48692c33b29d740bde" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.361676 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d61221be-c05f-47ae-a3b5-80f59d809281" containerName="nova-metadata-metadata" containerID="cri-o://223e73b546b611a128d4581fd3fab7f4ad5f58cffc7f3d629e05eb77a8f22f97" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.362843 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-wptw6_8a0229a2-b07d-4baa-8b4c-a1c356e38679/openstack-network-exporter/0.log" Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.362893 4749 generic.go:334] "Generic (PLEG): container finished" podID="8a0229a2-b07d-4baa-8b4c-a1c356e38679" containerID="45eb6214436b2b1d33093e3aaa79629869ba461dde4208aab226427673052ba4" exitCode=2 Mar 10 16:12:27 crc 
kubenswrapper[4749]: I0310 16:12:27.363007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wptw6" event={"ID":"8a0229a2-b07d-4baa-8b4c-a1c356e38679","Type":"ContainerDied","Data":"45eb6214436b2b1d33093e3aaa79629869ba461dde4208aab226427673052ba4"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.366475 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-ct548"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.372234 4749 generic.go:334] "Generic (PLEG): container finished" podID="bdf02d6a-5794-4b1d-b155-f683bdb8680d" containerID="217e914770081e2bccbbae1cf847983e071573c618440c545a24e7fc9b20a92f" exitCode=137 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.373917 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-ct548"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.383072 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-939f-account-create-update-l8hzh"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.406265 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-d8xlj"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.414166 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-d8xlj"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.417902 4749 generic.go:334] "Generic (PLEG): container finished" podID="15480433-b4c2-47c5-a7e4-73395b5bd27d" containerID="c88c00c58bcfe5242271bb002d37c1a1a9cd3e5dd3b4b9465326ba4e737970b2" exitCode=143 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.417971 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15480433-b4c2-47c5-a7e4-73395b5bd27d","Type":"ContainerDied","Data":"c88c00c58bcfe5242271bb002d37c1a1a9cd3e5dd3b4b9465326ba4e737970b2"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 
16:12:27.425248 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-67dd78ff7-qfbxb"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.425943 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-67dd78ff7-qfbxb" podUID="852c97ea-349d-4262-b36c-2ef7aa81ae75" containerName="proxy-httpd" containerID="cri-o://aaeec1a32c2eac40dd562bef0ec6d26e9e7922c02915dd496d085b035d04bd0f" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.426583 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-67dd78ff7-qfbxb" podUID="852c97ea-349d-4262-b36c-2ef7aa81ae75" containerName="proxy-server" containerID="cri-o://fb1538a41411d8393adf1f0c90bcd1848a6d9fc8136457b097afbfd0c4663176" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.429135 4749 generic.go:334] "Generic (PLEG): container finished" podID="dafd71a4-7276-4bce-84d9-6568e9d38d9d" containerID="c5c3ca0f09ffdc3b0ca768bffd9853e585b15e8e8d35502eee9fe5cdf2e81621" exitCode=0 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.429188 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db588689-ff8h6" event={"ID":"dafd71a4-7276-4bce-84d9-6568e9d38d9d","Type":"ContainerDied","Data":"c5c3ca0f09ffdc3b0ca768bffd9853e585b15e8e8d35502eee9fe5cdf2e81621"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.457895 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b3814c41-600a-4463-9695-e55c293ffead/ovsdbserver-nb/0.log" Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.457943 4749 generic.go:334] "Generic (PLEG): container finished" podID="b3814c41-600a-4463-9695-e55c293ffead" containerID="36ccb0b67d01f8e0c84de941ec69aa7a4955ba535082448c8b6fccd6ff57bdab" exitCode=2 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.457960 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="b3814c41-600a-4463-9695-e55c293ffead" containerID="5371680630eb3603a303a6b8c490e07844917998f70f368075231557f3230b3f" exitCode=143 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.458026 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b3814c41-600a-4463-9695-e55c293ffead","Type":"ContainerDied","Data":"36ccb0b67d01f8e0c84de941ec69aa7a4955ba535082448c8b6fccd6ff57bdab"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.458051 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b3814c41-600a-4463-9695-e55c293ffead","Type":"ContainerDied","Data":"5371680630eb3603a303a6b8c490e07844917998f70f368075231557f3230b3f"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.458791 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-88fa-account-create-update-lqrp8"] Mar 10 16:12:27 crc kubenswrapper[4749]: E0310 16:12:27.460065 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5371680630eb3603a303a6b8c490e07844917998f70f368075231557f3230b3f is running failed: container process not found" containerID="5371680630eb3603a303a6b8c490e07844917998f70f368075231557f3230b3f" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 10 16:12:27 crc kubenswrapper[4749]: E0310 16:12:27.465587 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5371680630eb3603a303a6b8c490e07844917998f70f368075231557f3230b3f is running failed: container process not found" containerID="5371680630eb3603a303a6b8c490e07844917998f70f368075231557f3230b3f" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 10 16:12:27 crc kubenswrapper[4749]: E0310 16:12:27.466908 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 
10 16:12:27 crc kubenswrapper[4749]: E0310 16:12:27.466965 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data podName:1feaa4c9-2cec-45a8-9106-5be885c26eae nodeName:}" failed. No retries permitted until 2026-03-10 16:12:29.466949679 +0000 UTC m=+1446.588815366 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data") pod "rabbitmq-server-0" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae") : configmap "rabbitmq-config-data" not found Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.467262 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-67dc88fb49-f9s7n"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.467484 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-67dc88fb49-f9s7n" podUID="8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" containerName="barbican-worker-log" containerID="cri-o://7ba23112caa9623ef0f1361b165f1a5f82eb24b9e458cf8a3e1047489301781f" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.467761 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-67dc88fb49-f9s7n" podUID="8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" containerName="barbican-worker" containerID="cri-o://eb58db301501949b729719960a2218e0ade2ee5b0f920eebc51295d80357ecf6" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: E0310 16:12:27.472282 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5371680630eb3603a303a6b8c490e07844917998f70f368075231557f3230b3f is running failed: container process not found" containerID="5371680630eb3603a303a6b8c490e07844917998f70f368075231557f3230b3f" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 10 
16:12:27 crc kubenswrapper[4749]: E0310 16:12:27.472326 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5371680630eb3603a303a6b8c490e07844917998f70f368075231557f3230b3f is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="b3814c41-600a-4463-9695-e55c293ffead" containerName="ovsdbserver-nb" Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.477536 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f7b864884-n5l5z"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.477778 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f7b864884-n5l5z" podUID="fd8a90f3-a6d3-428e-a049-78cb36e2ed34" containerName="barbican-api-log" containerID="cri-o://8d6ca1fb9f59946d8771109218993126572d8059bdcaaf95b7b5954d7fd097ee" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.478358 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f7b864884-n5l5z" podUID="fd8a90f3-a6d3-428e-a049-78cb36e2ed34" containerName="barbican-api" containerID="cri-o://a5a9d70f8163b90e1531e0cff378ba14e371d6d2dacaf35b433db16489b0e4f9" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.478545 4749 generic.go:334] "Generic (PLEG): container finished" podID="7cc64163-530a-4b31-9acc-84910336b781" containerID="e35a1f1f5541a5e016192c72f9089e80dd9f3fd2c9d8da246bcb6d412f4bd4c2" exitCode=143 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.478586 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8dfcffcf6-962bk" event={"ID":"7cc64163-530a-4b31-9acc-84910336b781","Type":"ContainerDied","Data":"e35a1f1f5541a5e016192c72f9089e80dd9f3fd2c9d8da246bcb6d412f4bd4c2"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.486274 4749 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-keystone-listener-94bd49868-nj59v"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.486512 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-94bd49868-nj59v" podUID="7e75ef50-1c0b-498e-8448-39a7c8912f96" containerName="barbican-keystone-listener-log" containerID="cri-o://9ab5efab7e6f316d75eb83fd5392310a8b177a64854f2535c92533beb31eecdb" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.486699 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-94bd49868-nj59v" podUID="7e75ef50-1c0b-498e-8448-39a7c8912f96" containerName="barbican-keystone-listener" containerID="cri-o://9439fbc59b1f401d9aeefa7b3aad4d384a86e1422ff6cf26455e69cb43403ea8" gracePeriod=30 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.490496 4749 generic.go:334] "Generic (PLEG): container finished" podID="1b598099-b3f7-4157-8e5f-6eb472806511" containerID="cab2c2597fe3eedc75127b4143e5f1b6bbdc90ba2c1b1f74f9e373f1b0ed0f17" exitCode=143 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.490561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1b598099-b3f7-4157-8e5f-6eb472806511","Type":"ContainerDied","Data":"cab2c2597fe3eedc75127b4143e5f1b6bbdc90ba2c1b1f74f9e373f1b0ed0f17"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.494544 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.494780 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b5ba9db0-29a2-468a-ab78-871620e30790" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f431f76071d5cc54434dee7be2f8613b19cde266ddb731b3a3189507c34c42ae" gracePeriod=30 Mar 10 16:12:27 crc 
kubenswrapper[4749]: I0310 16:12:27.506301 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.511192 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_99aedb1b-bca3-41ef-9399-4678f86ac87c/ovsdbserver-sb/0.log" Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.511239 4749 generic.go:334] "Generic (PLEG): container finished" podID="99aedb1b-bca3-41ef-9399-4678f86ac87c" containerID="ab8c1ce0bf3c8cbe9d4fea8597af8d5906a17d426930351c4a8cef5fb3330560" exitCode=2 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.511255 4749 generic.go:334] "Generic (PLEG): container finished" podID="99aedb1b-bca3-41ef-9399-4678f86ac87c" containerID="04134bf7a94714ff5e0058f41e64303d4889270f61ee60ea65b0f477b594285b" exitCode=143 Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.511277 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"99aedb1b-bca3-41ef-9399-4678f86ac87c","Type":"ContainerDied","Data":"ab8c1ce0bf3c8cbe9d4fea8597af8d5906a17d426930351c4a8cef5fb3330560"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.511305 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"99aedb1b-bca3-41ef-9399-4678f86ac87c","Type":"ContainerDied","Data":"04134bf7a94714ff5e0058f41e64303d4889270f61ee60ea65b0f477b594285b"} Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.514196 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.684964 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" containerName="rabbitmq" containerID="cri-o://3755df7d0a3f21329c48cc7cedfea9c0673b59bab2514f03a809161f3ed9250a" gracePeriod=604800 Mar 10 16:12:27 crc 
kubenswrapper[4749]: I0310 16:12:27.710926 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11d152df-3150-4e52-ac41-1288d89383c2" path="/var/lib/kubelet/pods/11d152df-3150-4e52-ac41-1288d89383c2/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.712286 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16f63562-8b95-41ed-92c8-f7a215854065" path="/var/lib/kubelet/pods/16f63562-8b95-41ed-92c8-f7a215854065/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.712894 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e966eb8-aa23-4b7a-8477-1e6e321054f9" path="/var/lib/kubelet/pods/1e966eb8-aa23-4b7a-8477-1e6e321054f9/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.713565 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f0ac736-fc91-42f3-a2af-564141a88227" path="/var/lib/kubelet/pods/1f0ac736-fc91-42f3-a2af-564141a88227/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.718561 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b118176-15b4-4d8c-a2d4-8bc3e53dcd60" path="/var/lib/kubelet/pods/2b118176-15b4-4d8c-a2d4-8bc3e53dcd60/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.719190 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46585eed-121a-4500-a26b-70bddeeeb075" path="/var/lib/kubelet/pods/46585eed-121a-4500-a26b-70bddeeeb075/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.719817 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="489f0d93-0a70-476e-b7ab-7db40933bf88" path="/var/lib/kubelet/pods/489f0d93-0a70-476e-b7ab-7db40933bf88/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.748611 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d4d52fc-2652-4251-afea-b3d1e39ed0f3" path="/var/lib/kubelet/pods/5d4d52fc-2652-4251-afea-b3d1e39ed0f3/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.749779 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="709acaab-3856-4321-8076-f615a144105d" path="/var/lib/kubelet/pods/709acaab-3856-4321-8076-f615a144105d/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.750642 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72ccda46-f693-4e5e-82e9-a874cafbceb8" path="/var/lib/kubelet/pods/72ccda46-f693-4e5e-82e9-a874cafbceb8/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.754574 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a5d1831-eae7-4ede-a37d-158ef6140d54" path="/var/lib/kubelet/pods/7a5d1831-eae7-4ede-a37d-158ef6140d54/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.785926 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c820403-de63-4498-b9b9-f9881586293a" path="/var/lib/kubelet/pods/8c820403-de63-4498-b9b9-f9881586293a/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.790224 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f657e52-1b31-417c-8cf2-093bd5c6b8f2" path="/var/lib/kubelet/pods/8f657e52-1b31-417c-8cf2-093bd5c6b8f2/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.791048 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f6aaf20-62e0-47eb-b54d-6edbdf95e770" path="/var/lib/kubelet/pods/8f6aaf20-62e0-47eb-b54d-6edbdf95e770/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.791084 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="1feaa4c9-2cec-45a8-9106-5be885c26eae" containerName="rabbitmq" containerID="cri-o://26814de58b1f416e7e0be2cfe89690c8b3811cee361fb2844178ac8832bae25d" gracePeriod=604800
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.791826 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98258a79-dfdc-4fd5-be54-d94353ae3fe7" path="/var/lib/kubelet/pods/98258a79-dfdc-4fd5-be54-d94353ae3fe7/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.794195 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1424b4e-9b0e-4108-81e0-6adcd7ec34cc" path="/var/lib/kubelet/pods/b1424b4e-9b0e-4108-81e0-6adcd7ec34cc/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.795127 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b68ebbd2-399e-4e9d-938d-c3209c46d76a" path="/var/lib/kubelet/pods/b68ebbd2-399e-4e9d-938d-c3209c46d76a/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.796473 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d88b3e71-b8ae-44cf-a104-3236bc27a87f" path="/var/lib/kubelet/pods/d88b3e71-b8ae-44cf-a104-3236bc27a87f/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.797439 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3f2f368-abe2-4fe8-835e-8c60a954ab97" path="/var/lib/kubelet/pods/e3f2f368-abe2-4fe8-835e-8c60a954ab97/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.798952 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f643319b-ae93-4b48-b10d-7e7f5f27a7c6" path="/var/lib/kubelet/pods/f643319b-ae93-4b48-b10d-7e7f5f27a7c6/volumes"
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.800790 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sb5c8"]
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.800839 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.800883 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sb5c8"]
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.800925 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c6g4w"]
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.801231 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="e80985ef-0a5d-403a-b351-c59bd878723d" containerName="nova-cell1-conductor-conductor" containerID="cri-o://c94663d92d50885e5f9777c2186efd9f1a68ab7dca303b557f1ab7d4547ae21e" gracePeriod=30
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.814216 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.828453 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="c31b4d97-4ea8-411f-873a-1ad6c133b917" containerName="nova-cell0-conductor-conductor" containerID="cri-o://f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5" gracePeriod=30
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.832791 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="feb87bc1-b9a8-44e7-8603-ba656ef9e65c" containerName="galera" containerID="cri-o://0ee3f2375f085150303c2b6ba4607901034d43e9521515b17af0d05668a11f6f" gracePeriod=30
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.840423 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c6g4w"]
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.860091 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 16:12:27 crc kubenswrapper[4749]: I0310 16:12:27.892640 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="46e39f11-450f-43a3-ba72-7c3e8245e382" containerName="nova-scheduler-scheduler" containerID="cri-o://c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7" gracePeriod=30
Mar 10 16:12:28 crc kubenswrapper[4749]: E0310 16:12:28.045749 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 10 16:12:28 crc kubenswrapper[4749]: E0310 16:12:28.047229 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 10 16:12:28 crc kubenswrapper[4749]: E0310 16:12:28.056732 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 10 16:12:28 crc kubenswrapper[4749]: E0310 16:12:28.056807 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c31b4d97-4ea8-411f-873a-1ad6c133b917" containerName="nova-cell0-conductor-conductor"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.126740 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-bd2hf" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovs-vswitchd" containerID="cri-o://524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" gracePeriod=29
Mar 10 16:12:28 crc kubenswrapper[4749]: E0310 16:12:28.203676 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01351004_ea7d_4973_9dd2_859022a35edb.slice/crio-bb17514493f3006a9700ec5156a08c7d51cbec38b230b0424cddada1c317646a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd8a90f3_a6d3_428e_a049_78cb36e2ed34.slice/crio-8d6ca1fb9f59946d8771109218993126572d8059bdcaaf95b7b5954d7fd097ee.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod852c97ea_349d_4262_b36c_2ef7aa81ae75.slice/crio-aaeec1a32c2eac40dd562bef0ec6d26e9e7922c02915dd496d085b035d04bd0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod852c97ea_349d_4262_b36c_2ef7aa81ae75.slice/crio-fb1538a41411d8393adf1f0c90bcd1848a6d9fc8136457b097afbfd0c4663176.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85d50314_7d2d_4d92_9a78_846a573a3000.slice/crio-conmon-65364d61dd576038476454d20bf73aad662c4d66d9c6d66420bac6b5eaf2e5a9.scope\": RecentStats: unable to find data in memory cache]"
Mar 10 16:12:28 crc kubenswrapper[4749]: E0310 16:12:28.345576 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Mar 10 16:12:28 crc kubenswrapper[4749]: E0310 16:12:28.345645 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-config-data podName:d34f67ec-ba88-43c9-84af-2c59a2dbbbe3 nodeName:}" failed. No retries permitted until 2026-03-10 16:12:30.345627508 +0000 UTC m=+1447.467493195 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-config-data") pod "rabbitmq-cell1-server-0" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3") : configmap "rabbitmq-cell1-config-data" not found
Mar 10 16:12:28 crc kubenswrapper[4749]: E0310 16:12:28.426491 4749 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Mar 10 16:12:28 crc kubenswrapper[4749]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Mar 10 16:12:28 crc kubenswrapper[4749]: + source /usr/local/bin/container-scripts/functions
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNBridge=br-int
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNRemote=tcp:localhost:6642
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNEncapType=geneve
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNAvailabilityZones=
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ EnableChassisAsGateway=true
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ PhysicalNetworks=
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNHostName=
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ DB_FILE=/etc/openvswitch/conf.db
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ ovs_dir=/var/lib/openvswitch
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Mar 10 16:12:28 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 10 16:12:28 crc kubenswrapper[4749]: + sleep 0.5
Mar 10 16:12:28 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 10 16:12:28 crc kubenswrapper[4749]: + sleep 0.5
Mar 10 16:12:28 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 10 16:12:28 crc kubenswrapper[4749]: + sleep 0.5
Mar 10 16:12:28 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 10 16:12:28 crc kubenswrapper[4749]: + cleanup_ovsdb_server_semaphore
Mar 10 16:12:28 crc kubenswrapper[4749]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Mar 10 16:12:28 crc kubenswrapper[4749]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Mar 10 16:12:28 crc kubenswrapper[4749]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-bd2hf" message=<
Mar 10 16:12:28 crc kubenswrapper[4749]: Exiting ovsdb-server (5) [ OK ]
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Mar 10 16:12:28 crc kubenswrapper[4749]: + source /usr/local/bin/container-scripts/functions
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNBridge=br-int
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNRemote=tcp:localhost:6642
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNEncapType=geneve
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNAvailabilityZones=
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ EnableChassisAsGateway=true
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ PhysicalNetworks=
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNHostName=
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ DB_FILE=/etc/openvswitch/conf.db
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ ovs_dir=/var/lib/openvswitch
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Mar 10 16:12:28 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 10 16:12:28 crc kubenswrapper[4749]: + sleep 0.5
Mar 10 16:12:28 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 10 16:12:28 crc kubenswrapper[4749]: + sleep 0.5
Mar 10 16:12:28 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 10 16:12:28 crc kubenswrapper[4749]: + sleep 0.5
Mar 10 16:12:28 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 10 16:12:28 crc kubenswrapper[4749]: + cleanup_ovsdb_server_semaphore
Mar 10 16:12:28 crc kubenswrapper[4749]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Mar 10 16:12:28 crc kubenswrapper[4749]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Mar 10 16:12:28 crc kubenswrapper[4749]: >
Mar 10 16:12:28 crc kubenswrapper[4749]: E0310 16:12:28.426820 4749 kuberuntime_container.go:691] "PreStop hook failed" err=<
Mar 10 16:12:28 crc kubenswrapper[4749]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Mar 10 16:12:28 crc kubenswrapper[4749]: + source /usr/local/bin/container-scripts/functions
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNBridge=br-int
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNRemote=tcp:localhost:6642
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNEncapType=geneve
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNAvailabilityZones=
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ EnableChassisAsGateway=true
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ PhysicalNetworks=
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ OVNHostName=
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ DB_FILE=/etc/openvswitch/conf.db
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ ovs_dir=/var/lib/openvswitch
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Mar 10 16:12:28 crc kubenswrapper[4749]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Mar 10 16:12:28 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 10 16:12:28 crc kubenswrapper[4749]: + sleep 0.5
Mar 10 16:12:28 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 10 16:12:28 crc kubenswrapper[4749]: + sleep 0.5
Mar 10 16:12:28 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 10 16:12:28 crc kubenswrapper[4749]: + sleep 0.5
Mar 10 16:12:28 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 10 16:12:28 crc kubenswrapper[4749]: + cleanup_ovsdb_server_semaphore
Mar 10 16:12:28 crc kubenswrapper[4749]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Mar 10 16:12:28 crc kubenswrapper[4749]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Mar 10 16:12:28 crc kubenswrapper[4749]: > pod="openstack/ovn-controller-ovs-bd2hf" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovsdb-server" containerID="cri-o://091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.426852 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-bd2hf" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovsdb-server" containerID="cri-o://091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" gracePeriod=28
Mar 10 16:12:28 crc kubenswrapper[4749]: E0310 16:12:28.534730 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 04134bf7a94714ff5e0058f41e64303d4889270f61ee60ea65b0f477b594285b is running failed: container process not found" containerID="04134bf7a94714ff5e0058f41e64303d4889270f61ee60ea65b0f477b594285b" cmd=["/usr/bin/pidof","ovsdb-server"]
Mar 10 16:12:28 crc kubenswrapper[4749]: E0310 16:12:28.542517 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 04134bf7a94714ff5e0058f41e64303d4889270f61ee60ea65b0f477b594285b is running failed: container process not found" containerID="04134bf7a94714ff5e0058f41e64303d4889270f61ee60ea65b0f477b594285b" cmd=["/usr/bin/pidof","ovsdb-server"]
Mar 10 16:12:28 crc kubenswrapper[4749]: E0310 16:12:28.543707 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 04134bf7a94714ff5e0058f41e64303d4889270f61ee60ea65b0f477b594285b is running failed: container process not found" containerID="04134bf7a94714ff5e0058f41e64303d4889270f61ee60ea65b0f477b594285b" cmd=["/usr/bin/pidof","ovsdb-server"]
Mar 10 16:12:28 crc kubenswrapper[4749]: E0310 16:12:28.543737 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 04134bf7a94714ff5e0058f41e64303d4889270f61ee60ea65b0f477b594285b is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="99aedb1b-bca3-41ef-9399-4678f86ac87c" containerName="ovsdbserver-sb"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.544803 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02e4e4260cd0292c4be2ba0bc26efa024a8a4b47215ad8b9de889654cec14fcf"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.547926 4749 generic.go:334] "Generic (PLEG): container finished" podID="852c97ea-349d-4262-b36c-2ef7aa81ae75" containerID="fb1538a41411d8393adf1f0c90bcd1848a6d9fc8136457b097afbfd0c4663176" exitCode=0
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.547946 4749 generic.go:334] "Generic (PLEG): container finished" podID="852c97ea-349d-4262-b36c-2ef7aa81ae75" containerID="aaeec1a32c2eac40dd562bef0ec6d26e9e7922c02915dd496d085b035d04bd0f" exitCode=0
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.548016 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67dd78ff7-qfbxb" event={"ID":"852c97ea-349d-4262-b36c-2ef7aa81ae75","Type":"ContainerDied","Data":"fb1538a41411d8393adf1f0c90bcd1848a6d9fc8136457b097afbfd0c4663176"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.548071 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67dd78ff7-qfbxb" event={"ID":"852c97ea-349d-4262-b36c-2ef7aa81ae75","Type":"ContainerDied","Data":"aaeec1a32c2eac40dd562bef0ec6d26e9e7922c02915dd496d085b035d04bd0f"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.555764 4749 generic.go:334] "Generic (PLEG): container finished" podID="d61221be-c05f-47ae-a3b5-80f59d809281" containerID="cd77f536ef94e68fc550ef2465958581ff6e48ae27050c48692c33b29d740bde" exitCode=143
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.555853 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d61221be-c05f-47ae-a3b5-80f59d809281","Type":"ContainerDied","Data":"cd77f536ef94e68fc550ef2465958581ff6e48ae27050c48692c33b29d740bde"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.575016 4749 generic.go:334] "Generic (PLEG): container finished" podID="01351004-ea7d-4973-9dd2-859022a35edb" containerID="bb17514493f3006a9700ec5156a08c7d51cbec38b230b0424cddada1c317646a" exitCode=0
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.575081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"01351004-ea7d-4973-9dd2-859022a35edb","Type":"ContainerDied","Data":"bb17514493f3006a9700ec5156a08c7d51cbec38b230b0424cddada1c317646a"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.580847 4749 generic.go:334] "Generic (PLEG): container finished" podID="8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" containerID="eb58db301501949b729719960a2218e0ade2ee5b0f920eebc51295d80357ecf6" exitCode=0
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.580881 4749 generic.go:334] "Generic (PLEG): container finished" podID="8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" containerID="7ba23112caa9623ef0f1361b165f1a5f82eb24b9e458cf8a3e1047489301781f" exitCode=143
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.580929 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67dc88fb49-f9s7n" event={"ID":"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3","Type":"ContainerDied","Data":"eb58db301501949b729719960a2218e0ade2ee5b0f920eebc51295d80357ecf6"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.580957 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67dc88fb49-f9s7n" event={"ID":"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3","Type":"ContainerDied","Data":"7ba23112caa9623ef0f1361b165f1a5f82eb24b9e458cf8a3e1047489301781f"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.584010 4749 generic.go:334] "Generic (PLEG): container finished" podID="236aa9f6-5238-45de-813d-e0b18c887f64" containerID="44cd16ebefec8b032bd832bb6a1686dd3509d76e1b509d8e93cab3ba9ee33de6" exitCode=0
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.584048 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-658447d949-bwfgt" event={"ID":"236aa9f6-5238-45de-813d-e0b18c887f64","Type":"ContainerDied","Data":"44cd16ebefec8b032bd832bb6a1686dd3509d76e1b509d8e93cab3ba9ee33de6"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.596942 4749 generic.go:334] "Generic (PLEG): container finished" podID="fd8a90f3-a6d3-428e-a049-78cb36e2ed34" containerID="8d6ca1fb9f59946d8771109218993126572d8059bdcaaf95b7b5954d7fd097ee" exitCode=143
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.597037 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f7b864884-n5l5z" event={"ID":"fd8a90f3-a6d3-428e-a049-78cb36e2ed34","Type":"ContainerDied","Data":"8d6ca1fb9f59946d8771109218993126572d8059bdcaaf95b7b5954d7fd097ee"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.604739 4749 generic.go:334] "Generic (PLEG): container finished" podID="ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" containerID="15200dc474647db1a4fe0a09c7e300067457404fd74cf484162cb0842079dff1" exitCode=143
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.604846 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e","Type":"ContainerDied","Data":"15200dc474647db1a4fe0a09c7e300067457404fd74cf484162cb0842079dff1"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.610585 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b3814c41-600a-4463-9695-e55c293ffead/ovsdbserver-nb/0.log"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.610737 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b3814c41-600a-4463-9695-e55c293ffead","Type":"ContainerDied","Data":"b89cd3e50cda5065b116740ac396cc4ed54a24a5d1be39eeb02a4c343838d19d"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.610767 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b89cd3e50cda5065b116740ac396cc4ed54a24a5d1be39eeb02a4c343838d19d"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.630447 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_99aedb1b-bca3-41ef-9399-4678f86ac87c/ovsdbserver-sb/0.log"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.630567 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"99aedb1b-bca3-41ef-9399-4678f86ac87c","Type":"ContainerDied","Data":"bd7b870a906040d745505efde228d65a6fa5f2974b6032afdb234dab118d0aa3"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.630594 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd7b870a906040d745505efde228d65a6fa5f2974b6032afdb234dab118d0aa3"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.660153 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-wptw6_8a0229a2-b07d-4baa-8b4c-a1c356e38679/openstack-network-exporter/0.log"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.660281 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-wptw6" event={"ID":"8a0229a2-b07d-4baa-8b4c-a1c356e38679","Type":"ContainerDied","Data":"3296c78784ffeff1936d437132fa977e27a916744d5029bf253445b5784c62ce"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.660309 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3296c78784ffeff1936d437132fa977e27a916744d5029bf253445b5784c62ce"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.693839 4749 generic.go:334] "Generic (PLEG): container finished" podID="7e75ef50-1c0b-498e-8448-39a7c8912f96" containerID="9ab5efab7e6f316d75eb83fd5392310a8b177a64854f2535c92533beb31eecdb" exitCode=143
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.693935 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94bd49868-nj59v" event={"ID":"7e75ef50-1c0b-498e-8448-39a7c8912f96","Type":"ContainerDied","Data":"9ab5efab7e6f316d75eb83fd5392310a8b177a64854f2535c92533beb31eecdb"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.700932 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db588689-ff8h6" event={"ID":"dafd71a4-7276-4bce-84d9-6568e9d38d9d","Type":"ContainerDied","Data":"492f5268c4141b5258a1ec3b21588ede536c407a07dc073e190797c9ab0052f7"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.700957 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="492f5268c4141b5258a1ec3b21588ede536c407a07dc073e190797c9ab0052f7"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.742712 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="250241ced61fd2f305f7eac66ea651ce382c5d2b773ef64a2685a3c7d8d51177" exitCode=0
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.742747 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="3166a1f44ecaa58fb6e63606a58aaeab5a00e8125684017b580c9d113d9e28b5" exitCode=0
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.742758 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="9e2d6b3f1436ccf55b4a7e8f114d51974e8455865ec7c14f9e8e310b24cbd46f" exitCode=0
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.742767 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="8e3dc189fd5a5f36d1ed3928478b796a36ebae234d6abfa35187c0cc7daab6fc" exitCode=0
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.742774 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="65364d61dd576038476454d20bf73aad662c4d66d9c6d66420bac6b5eaf2e5a9" exitCode=0
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.742784 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="2efb77202f22e2883bdd91f0e2cbe30b348e7e743f36692bb368eee05f179b1c" exitCode=0
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.742791 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="29ac57ed16691afc977b4681e0422ae230ea59a572f4650e3bdddd776aa79bb5" exitCode=0
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.742849 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"250241ced61fd2f305f7eac66ea651ce382c5d2b773ef64a2685a3c7d8d51177"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.742875 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"3166a1f44ecaa58fb6e63606a58aaeab5a00e8125684017b580c9d113d9e28b5"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.742885 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"9e2d6b3f1436ccf55b4a7e8f114d51974e8455865ec7c14f9e8e310b24cbd46f"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.742894 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"8e3dc189fd5a5f36d1ed3928478b796a36ebae234d6abfa35187c0cc7daab6fc"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.742905 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"65364d61dd576038476454d20bf73aad662c4d66d9c6d66420bac6b5eaf2e5a9"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.742914 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"2efb77202f22e2883bdd91f0e2cbe30b348e7e743f36692bb368eee05f179b1c"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.742923 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"29ac57ed16691afc977b4681e0422ae230ea59a572f4650e3bdddd776aa79bb5"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.750790 4749 generic.go:334] "Generic (PLEG): container finished" podID="e03a8285-2164-42a8-8887-95bdaf021a73" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" exitCode=0
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.750862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bd2hf" event={"ID":"e03a8285-2164-42a8-8887-95bdaf021a73","Type":"ContainerDied","Data":"091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f"}
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.928718 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-wptw6_8a0229a2-b07d-4baa-8b4c-a1c356e38679/openstack-network-exporter/0.log"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.929052 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-wptw6"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.944531 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.944561 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_99aedb1b-bca3-41ef-9399-4678f86ac87c/ovsdbserver-sb/0.log"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.944642 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.965827 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57db588689-ff8h6"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.969706 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b3814c41-600a-4463-9695-e55c293ffead/ovsdbserver-nb/0.log"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.969801 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.985202 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-67dd78ff7-qfbxb"
Mar 10 16:12:28 crc kubenswrapper[4749]: I0310 16:12:28.990430 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-67dc88fb49-f9s7n"
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.002165 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-94bd49868-nj59v"
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059300 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/852c97ea-349d-4262-b36c-2ef7aa81ae75-log-httpd\") pod \"852c97ea-349d-4262-b36c-2ef7aa81ae75\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") "
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059349 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3814c41-600a-4463-9695-e55c293ffead-config\") pod \"b3814c41-600a-4463-9695-e55c293ffead\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") "
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059411 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3814c41-600a-4463-9695-e55c293ffead-scripts\") pod \"b3814c41-600a-4463-9695-e55c293ffead\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") "
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059441 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6h96\" (UniqueName: \"kubernetes.io/projected/852c97ea-349d-4262-b36c-2ef7aa81ae75-kube-api-access-r6h96\") pod \"852c97ea-349d-4262-b36c-2ef7aa81ae75\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") "
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059467 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf02d6a-5794-4b1d-b155-f683bdb8680d-openstack-config-secret\") pod \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") "
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059502 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj5cf\" (UniqueName: \"kubernetes.io/projected/dafd71a4-7276-4bce-84d9-6568e9d38d9d-kube-api-access-bj5cf\") pod \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") "
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059531 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-combined-ca-bundle\") pod \"b3814c41-600a-4463-9695-e55c293ffead\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") "
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059558 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-combined-ca-bundle\") pod \"99aedb1b-bca3-41ef-9399-4678f86ac87c\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") "
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059585 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b3814c41-600a-4463-9695-e55c293ffead-ovsdb-rundir\") pod \"b3814c41-600a-4463-9695-e55c293ffead\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") "
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059609 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/852c97ea-349d-4262-b36c-2ef7aa81ae75-run-httpd\") pod \"852c97ea-349d-4262-b36c-2ef7aa81ae75\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") "
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059660 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-dns-swift-storage-0\") pod \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") "
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059698 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8a0229a2-b07d-4baa-8b4c-a1c356e38679-ovn-rundir\") pod \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") "
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059723 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99aedb1b-bca3-41ef-9399-4678f86ac87c-config\") pod \"99aedb1b-bca3-41ef-9399-4678f86ac87c\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") "
Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059743 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0229a2-b07d-4baa-8b4c-a1c356e38679-config\") pod \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\" (UID:
\"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059797 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-metrics-certs-tls-certs\") pod \"99aedb1b-bca3-41ef-9399-4678f86ac87c\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059825 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/852c97ea-349d-4262-b36c-2ef7aa81ae75-etc-swift\") pod \"852c97ea-349d-4262-b36c-2ef7aa81ae75\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bdf02d6a-5794-4b1d-b155-f683bdb8680d-openstack-config\") pod \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059880 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-dns-svc\") pod \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059913 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8a0229a2-b07d-4baa-8b4c-a1c356e38679-ovs-rundir\") pod \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059940 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8a0229a2-b07d-4baa-8b4c-a1c356e38679-metrics-certs-tls-certs\") pod \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.059967 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-ovsdbserver-nb-tls-certs\") pod \"b3814c41-600a-4463-9695-e55c293ffead\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060001 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-ovsdbserver-sb-tls-certs\") pod \"99aedb1b-bca3-41ef-9399-4678f86ac87c\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060024 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5qm\" (UniqueName: \"kubernetes.io/projected/b3814c41-600a-4463-9695-e55c293ffead-kube-api-access-qg5qm\") pod \"b3814c41-600a-4463-9695-e55c293ffead\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060028 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/852c97ea-349d-4262-b36c-2ef7aa81ae75-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "852c97ea-349d-4262-b36c-2ef7aa81ae75" (UID: "852c97ea-349d-4262-b36c-2ef7aa81ae75"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060050 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99aedb1b-bca3-41ef-9399-4678f86ac87c-ovsdb-rundir\") pod \"99aedb1b-bca3-41ef-9399-4678f86ac87c\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060072 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf02d6a-5794-4b1d-b155-f683bdb8680d-combined-ca-bundle\") pod \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060099 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-ovsdbserver-nb\") pod \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060123 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-public-tls-certs\") pod \"852c97ea-349d-4262-b36c-2ef7aa81ae75\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060157 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-internal-tls-certs\") pod \"852c97ea-349d-4262-b36c-2ef7aa81ae75\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060160 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b3814c41-600a-4463-9695-e55c293ffead-config" (OuterVolumeSpecName: "config") pod "b3814c41-600a-4463-9695-e55c293ffead" (UID: "b3814c41-600a-4463-9695-e55c293ffead"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060189 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"b3814c41-600a-4463-9695-e55c293ffead\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060214 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwshh\" (UniqueName: \"kubernetes.io/projected/99aedb1b-bca3-41ef-9399-4678f86ac87c-kube-api-access-lwshh\") pod \"99aedb1b-bca3-41ef-9399-4678f86ac87c\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060237 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-ovsdbserver-sb\") pod \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060281 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-metrics-certs-tls-certs\") pod \"b3814c41-600a-4463-9695-e55c293ffead\" (UID: \"b3814c41-600a-4463-9695-e55c293ffead\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060326 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"99aedb1b-bca3-41ef-9399-4678f86ac87c\" (UID: 
\"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060347 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0229a2-b07d-4baa-8b4c-a1c356e38679-combined-ca-bundle\") pod \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060362 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99aedb1b-bca3-41ef-9399-4678f86ac87c-config" (OuterVolumeSpecName: "config") pod "99aedb1b-bca3-41ef-9399-4678f86ac87c" (UID: "99aedb1b-bca3-41ef-9399-4678f86ac87c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060371 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmx2k\" (UniqueName: \"kubernetes.io/projected/bdf02d6a-5794-4b1d-b155-f683bdb8680d-kube-api-access-hmx2k\") pod \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\" (UID: \"bdf02d6a-5794-4b1d-b155-f683bdb8680d\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060428 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-config\") pod \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\" (UID: \"dafd71a4-7276-4bce-84d9-6568e9d38d9d\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060451 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x99t7\" (UniqueName: \"kubernetes.io/projected/8a0229a2-b07d-4baa-8b4c-a1c356e38679-kube-api-access-x99t7\") pod \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\" (UID: \"8a0229a2-b07d-4baa-8b4c-a1c356e38679\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060488 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-combined-ca-bundle\") pod \"852c97ea-349d-4262-b36c-2ef7aa81ae75\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060516 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99aedb1b-bca3-41ef-9399-4678f86ac87c-scripts\") pod \"99aedb1b-bca3-41ef-9399-4678f86ac87c\" (UID: \"99aedb1b-bca3-41ef-9399-4678f86ac87c\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060537 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-config-data\") pod \"852c97ea-349d-4262-b36c-2ef7aa81ae75\" (UID: \"852c97ea-349d-4262-b36c-2ef7aa81ae75\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.060801 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3814c41-600a-4463-9695-e55c293ffead-scripts" (OuterVolumeSpecName: "scripts") pod "b3814c41-600a-4463-9695-e55c293ffead" (UID: "b3814c41-600a-4463-9695-e55c293ffead"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.061038 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/852c97ea-349d-4262-b36c-2ef7aa81ae75-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.061055 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3814c41-600a-4463-9695-e55c293ffead-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.061067 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3814c41-600a-4463-9695-e55c293ffead-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.061078 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99aedb1b-bca3-41ef-9399-4678f86ac87c-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.066250 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/852c97ea-349d-4262-b36c-2ef7aa81ae75-kube-api-access-r6h96" (OuterVolumeSpecName: "kube-api-access-r6h96") pod "852c97ea-349d-4262-b36c-2ef7aa81ae75" (UID: "852c97ea-349d-4262-b36c-2ef7aa81ae75"). InnerVolumeSpecName "kube-api-access-r6h96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.066890 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3814c41-600a-4463-9695-e55c293ffead-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "b3814c41-600a-4463-9695-e55c293ffead" (UID: "b3814c41-600a-4463-9695-e55c293ffead"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.067141 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/852c97ea-349d-4262-b36c-2ef7aa81ae75-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "852c97ea-349d-4262-b36c-2ef7aa81ae75" (UID: "852c97ea-349d-4262-b36c-2ef7aa81ae75"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.068630 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a0229a2-b07d-4baa-8b4c-a1c356e38679-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "8a0229a2-b07d-4baa-8b4c-a1c356e38679" (UID: "8a0229a2-b07d-4baa-8b4c-a1c356e38679"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.068684 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a0229a2-b07d-4baa-8b4c-a1c356e38679-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "8a0229a2-b07d-4baa-8b4c-a1c356e38679" (UID: "8a0229a2-b07d-4baa-8b4c-a1c356e38679"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.083903 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99aedb1b-bca3-41ef-9399-4678f86ac87c-scripts" (OuterVolumeSpecName: "scripts") pod "99aedb1b-bca3-41ef-9399-4678f86ac87c" (UID: "99aedb1b-bca3-41ef-9399-4678f86ac87c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.087910 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0229a2-b07d-4baa-8b4c-a1c356e38679-config" (OuterVolumeSpecName: "config") pod "8a0229a2-b07d-4baa-8b4c-a1c356e38679" (UID: "8a0229a2-b07d-4baa-8b4c-a1c356e38679"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.088657 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99aedb1b-bca3-41ef-9399-4678f86ac87c-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "99aedb1b-bca3-41ef-9399-4678f86ac87c" (UID: "99aedb1b-bca3-41ef-9399-4678f86ac87c"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.092198 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dafd71a4-7276-4bce-84d9-6568e9d38d9d-kube-api-access-bj5cf" (OuterVolumeSpecName: "kube-api-access-bj5cf") pod "dafd71a4-7276-4bce-84d9-6568e9d38d9d" (UID: "dafd71a4-7276-4bce-84d9-6568e9d38d9d"). InnerVolumeSpecName "kube-api-access-bj5cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.103980 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/852c97ea-349d-4262-b36c-2ef7aa81ae75-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "852c97ea-349d-4262-b36c-2ef7aa81ae75" (UID: "852c97ea-349d-4262-b36c-2ef7aa81ae75"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.104307 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3814c41-600a-4463-9695-e55c293ffead-kube-api-access-qg5qm" (OuterVolumeSpecName: "kube-api-access-qg5qm") pod "b3814c41-600a-4463-9695-e55c293ffead" (UID: "b3814c41-600a-4463-9695-e55c293ffead"). InnerVolumeSpecName "kube-api-access-qg5qm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.106813 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a0229a2-b07d-4baa-8b4c-a1c356e38679-kube-api-access-x99t7" (OuterVolumeSpecName: "kube-api-access-x99t7") pod "8a0229a2-b07d-4baa-8b4c-a1c356e38679" (UID: "8a0229a2-b07d-4baa-8b4c-a1c356e38679"). InnerVolumeSpecName "kube-api-access-x99t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.112769 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "b3814c41-600a-4463-9695-e55c293ffead" (UID: "b3814c41-600a-4463-9695-e55c293ffead"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.112822 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "99aedb1b-bca3-41ef-9399-4678f86ac87c" (UID: "99aedb1b-bca3-41ef-9399-4678f86ac87c"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.112917 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99aedb1b-bca3-41ef-9399-4678f86ac87c-kube-api-access-lwshh" (OuterVolumeSpecName: "kube-api-access-lwshh") pod "99aedb1b-bca3-41ef-9399-4678f86ac87c" (UID: "99aedb1b-bca3-41ef-9399-4678f86ac87c"). InnerVolumeSpecName "kube-api-access-lwshh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.113050 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdf02d6a-5794-4b1d-b155-f683bdb8680d-kube-api-access-hmx2k" (OuterVolumeSpecName: "kube-api-access-hmx2k") pod "bdf02d6a-5794-4b1d-b155-f683bdb8680d" (UID: "bdf02d6a-5794-4b1d-b155-f683bdb8680d"). InnerVolumeSpecName "kube-api-access-hmx2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.157366 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdf02d6a-5794-4b1d-b155-f683bdb8680d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "bdf02d6a-5794-4b1d-b155-f683bdb8680d" (UID: "bdf02d6a-5794-4b1d-b155-f683bdb8680d"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.162174 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x87zp\" (UniqueName: \"kubernetes.io/projected/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-kube-api-access-x87zp\") pod \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.162267 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-combined-ca-bundle\") pod \"7e75ef50-1c0b-498e-8448-39a7c8912f96\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.163331 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-config-data-custom\") pod \"7e75ef50-1c0b-498e-8448-39a7c8912f96\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.163421 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-config-data-custom\") pod \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.165718 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e75ef50-1c0b-498e-8448-39a7c8912f96-logs\") pod \"7e75ef50-1c0b-498e-8448-39a7c8912f96\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.165810 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-logs\") pod \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.165856 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-combined-ca-bundle\") pod \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.165874 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-config-data\") pod \"7e75ef50-1c0b-498e-8448-39a7c8912f96\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.165914 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqg75\" (UniqueName: \"kubernetes.io/projected/7e75ef50-1c0b-498e-8448-39a7c8912f96-kube-api-access-xqg75\") pod \"7e75ef50-1c0b-498e-8448-39a7c8912f96\" (UID: \"7e75ef50-1c0b-498e-8448-39a7c8912f96\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.165930 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-config-data\") pod \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\" (UID: \"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166778 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99aedb1b-bca3-41ef-9399-4678f86ac87c-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166807 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6h96\" 
(UniqueName: \"kubernetes.io/projected/852c97ea-349d-4262-b36c-2ef7aa81ae75-kube-api-access-r6h96\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166819 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj5cf\" (UniqueName: \"kubernetes.io/projected/dafd71a4-7276-4bce-84d9-6568e9d38d9d-kube-api-access-bj5cf\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166827 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b3814c41-600a-4463-9695-e55c293ffead-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166837 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/852c97ea-349d-4262-b36c-2ef7aa81ae75-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166846 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8a0229a2-b07d-4baa-8b4c-a1c356e38679-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166854 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0229a2-b07d-4baa-8b4c-a1c356e38679-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166863 4749 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/852c97ea-349d-4262-b36c-2ef7aa81ae75-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166872 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bdf02d6a-5794-4b1d-b155-f683bdb8680d-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc 
kubenswrapper[4749]: I0310 16:12:29.166881 4749 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8a0229a2-b07d-4baa-8b4c-a1c356e38679-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166891 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5qm\" (UniqueName: \"kubernetes.io/projected/b3814c41-600a-4463-9695-e55c293ffead-kube-api-access-qg5qm\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166901 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99aedb1b-bca3-41ef-9399-4678f86ac87c-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166910 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwshh\" (UniqueName: \"kubernetes.io/projected/99aedb1b-bca3-41ef-9399-4678f86ac87c-kube-api-access-lwshh\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166932 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166947 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166957 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmx2k\" (UniqueName: \"kubernetes.io/projected/bdf02d6a-5794-4b1d-b155-f683bdb8680d-kube-api-access-hmx2k\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.166967 4749 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x99t7\" (UniqueName: \"kubernetes.io/projected/8a0229a2-b07d-4baa-8b4c-a1c356e38679-kube-api-access-x99t7\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.167090 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e75ef50-1c0b-498e-8448-39a7c8912f96-logs" (OuterVolumeSpecName: "logs") pod "7e75ef50-1c0b-498e-8448-39a7c8912f96" (UID: "7e75ef50-1c0b-498e-8448-39a7c8912f96"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.168447 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-logs" (OuterVolumeSpecName: "logs") pod "8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" (UID: "8885a49f-7161-40b8-aac2-ee8ad3e0a1b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.175522 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-kube-api-access-x87zp" (OuterVolumeSpecName: "kube-api-access-x87zp") pod "8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" (UID: "8885a49f-7161-40b8-aac2-ee8ad3e0a1b3"). InnerVolumeSpecName "kube-api-access-x87zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.177228 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" (UID: "8885a49f-7161-40b8-aac2-ee8ad3e0a1b3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.178507 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e75ef50-1c0b-498e-8448-39a7c8912f96-kube-api-access-xqg75" (OuterVolumeSpecName: "kube-api-access-xqg75") pod "7e75ef50-1c0b-498e-8448-39a7c8912f96" (UID: "7e75ef50-1c0b-498e-8448-39a7c8912f96"). InnerVolumeSpecName "kube-api-access-xqg75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.187584 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7e75ef50-1c0b-498e-8448-39a7c8912f96" (UID: "7e75ef50-1c0b-498e-8448-39a7c8912f96"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.189412 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99aedb1b-bca3-41ef-9399-4678f86ac87c" (UID: "99aedb1b-bca3-41ef-9399-4678f86ac87c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.218584 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0229a2-b07d-4baa-8b4c-a1c356e38679-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a0229a2-b07d-4baa-8b4c-a1c356e38679" (UID: "8a0229a2-b07d-4baa-8b4c-a1c356e38679"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.254728 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.266644 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.267674 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e75ef50-1c0b-498e-8448-39a7c8912f96" (UID: "7e75ef50-1c0b-498e-8448-39a7c8912f96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.269065 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.269097 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqg75\" (UniqueName: \"kubernetes.io/projected/7e75ef50-1c0b-498e-8448-39a7c8912f96-kube-api-access-xqg75\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.269107 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x87zp\" (UniqueName: \"kubernetes.io/projected/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-kube-api-access-x87zp\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.269116 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.269125 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.269134 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.269145 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0229a2-b07d-4baa-8b4c-a1c356e38679-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.269154 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.269163 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.269172 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e75ef50-1c0b-498e-8448-39a7c8912f96-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.269180 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 
16:12:29.291804 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3814c41-600a-4463-9695-e55c293ffead" (UID: "b3814c41-600a-4463-9695-e55c293ffead"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.292256 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf02d6a-5794-4b1d-b155-f683bdb8680d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdf02d6a-5794-4b1d-b155-f683bdb8680d" (UID: "bdf02d6a-5794-4b1d-b155-f683bdb8680d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.358068 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf02d6a-5794-4b1d-b155-f683bdb8680d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "bdf02d6a-5794-4b1d-b155-f683bdb8680d" (UID: "bdf02d6a-5794-4b1d-b155-f683bdb8680d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.373765 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dafd71a4-7276-4bce-84d9-6568e9d38d9d" (UID: "dafd71a4-7276-4bce-84d9-6568e9d38d9d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.375257 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf02d6a-5794-4b1d-b155-f683bdb8680d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.375271 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.375281 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf02d6a-5794-4b1d-b155-f683bdb8680d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.375292 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.376016 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-config" (OuterVolumeSpecName: "config") pod "dafd71a4-7276-4bce-84d9-6568e9d38d9d" (UID: "dafd71a4-7276-4bce-84d9-6568e9d38d9d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.382924 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "99aedb1b-bca3-41ef-9399-4678f86ac87c" (UID: "99aedb1b-bca3-41ef-9399-4678f86ac87c"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.383199 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dafd71a4-7276-4bce-84d9-6568e9d38d9d" (UID: "dafd71a4-7276-4bce-84d9-6568e9d38d9d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.396277 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dafd71a4-7276-4bce-84d9-6568e9d38d9d" (UID: "dafd71a4-7276-4bce-84d9-6568e9d38d9d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.396731 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "99aedb1b-bca3-41ef-9399-4678f86ac87c" (UID: "99aedb1b-bca3-41ef-9399-4678f86ac87c"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.418394 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "b3814c41-600a-4463-9695-e55c293ffead" (UID: "b3814c41-600a-4463-9695-e55c293ffead"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.420044 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.421119 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-config-data" (OuterVolumeSpecName: "config-data") pod "852c97ea-349d-4262-b36c-2ef7aa81ae75" (UID: "852c97ea-349d-4262-b36c-2ef7aa81ae75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.422868 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "852c97ea-349d-4262-b36c-2ef7aa81ae75" (UID: "852c97ea-349d-4262-b36c-2ef7aa81ae75"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.425963 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.429111 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" (UID: "8885a49f-7161-40b8-aac2-ee8ad3e0a1b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.466878 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dafd71a4-7276-4bce-84d9-6568e9d38d9d" (UID: "dafd71a4-7276-4bce-84d9-6568e9d38d9d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.480868 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.480892 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.480902 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.480913 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.480921 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.480930 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.480939 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: 
I0310 16:12:29.480947 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.480958 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99aedb1b-bca3-41ef-9399-4678f86ac87c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.480966 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dafd71a4-7276-4bce-84d9-6568e9d38d9d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: E0310 16:12:29.481016 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 10 16:12:29 crc kubenswrapper[4749]: E0310 16:12:29.481053 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data podName:1feaa4c9-2cec-45a8-9106-5be885c26eae nodeName:}" failed. No retries permitted until 2026-03-10 16:12:33.481040158 +0000 UTC m=+1450.602905845 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data") pod "rabbitmq-server-0" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae") : configmap "rabbitmq-config-data" not found Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.491765 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-config-data" (OuterVolumeSpecName: "config-data") pod "7e75ef50-1c0b-498e-8448-39a7c8912f96" (UID: "7e75ef50-1c0b-498e-8448-39a7c8912f96"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.500533 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "852c97ea-349d-4262-b36c-2ef7aa81ae75" (UID: "852c97ea-349d-4262-b36c-2ef7aa81ae75"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.512307 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-939f-account-create-update-l8hzh"] Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.529760 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8701-account-create-update-nnbrt"] Mar 10 16:12:29 crc kubenswrapper[4749]: W0310 16:12:29.545137 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod783a67d8_9e22_4503_82a6_5f49fb50ee7b.slice/crio-8e5e2f4f3ad0cf18460eca25f65292620b9f05026a53e092b43ab34352ddf401 WatchSource:0}: Error finding container 8e5e2f4f3ad0cf18460eca25f65292620b9f05026a53e092b43ab34352ddf401: Status 404 returned error can't find the container with id 8e5e2f4f3ad0cf18460eca25f65292620b9f05026a53e092b43ab34352ddf401 Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.553207 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vvxvc"] Mar 10 16:12:29 crc kubenswrapper[4749]: E0310 16:12:29.559352 4749 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 16:12:29 crc kubenswrapper[4749]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc 
kubenswrapper[4749]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: if [ -n "glance" ]; then Mar 10 16:12:29 crc kubenswrapper[4749]: GRANT_DATABASE="glance" Mar 10 16:12:29 crc kubenswrapper[4749]: else Mar 10 16:12:29 crc kubenswrapper[4749]: GRANT_DATABASE="*" Mar 10 16:12:29 crc kubenswrapper[4749]: fi Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: # going for maximum compatibility here: Mar 10 16:12:29 crc kubenswrapper[4749]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 10 16:12:29 crc kubenswrapper[4749]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 10 16:12:29 crc kubenswrapper[4749]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 10 16:12:29 crc kubenswrapper[4749]: # support updates Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: $MYSQL_CMD < logger="UnhandledError" Mar 10 16:12:29 crc kubenswrapper[4749]: E0310 16:12:29.561481 4749 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 16:12:29 crc kubenswrapper[4749]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: if [ -n "nova_cell0" ]; then Mar 10 16:12:29 crc kubenswrapper[4749]: GRANT_DATABASE="nova_cell0" Mar 10 16:12:29 crc kubenswrapper[4749]: else Mar 10 16:12:29 crc kubenswrapper[4749]: GRANT_DATABASE="*" Mar 10 16:12:29 crc kubenswrapper[4749]: fi Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: # going for maximum compatibility here: Mar 10 16:12:29 crc kubenswrapper[4749]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 10 16:12:29 crc kubenswrapper[4749]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 10 16:12:29 crc kubenswrapper[4749]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 10 16:12:29 crc kubenswrapper[4749]: # support updates Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: $MYSQL_CMD < logger="UnhandledError" Mar 10 16:12:29 crc kubenswrapper[4749]: E0310 16:12:29.561500 4749 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 16:12:29 crc kubenswrapper[4749]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: if [ -n "nova_cell1" ]; then Mar 10 16:12:29 crc kubenswrapper[4749]: GRANT_DATABASE="nova_cell1" Mar 10 16:12:29 crc kubenswrapper[4749]: else Mar 10 16:12:29 crc kubenswrapper[4749]: GRANT_DATABASE="*" Mar 10 16:12:29 crc kubenswrapper[4749]: fi Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: # going for maximum compatibility here: Mar 10 16:12:29 crc kubenswrapper[4749]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 10 16:12:29 crc kubenswrapper[4749]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 10 16:12:29 crc kubenswrapper[4749]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 10 16:12:29 crc kubenswrapper[4749]: # support updates Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: $MYSQL_CMD < logger="UnhandledError" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.564401 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-88fa-account-create-update-lqrp8"] Mar 10 16:12:29 crc kubenswrapper[4749]: E0310 16:12:29.568027 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-88fa-account-create-update-lqrp8" podUID="615e021a-88a2-496f-81a4-46d70e40310d" Mar 10 16:12:29 crc kubenswrapper[4749]: E0310 16:12:29.568084 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-939f-account-create-update-l8hzh" podUID="d5ef97dd-388e-4fc7-82bf-908c61ca2fe2" Mar 10 16:12:29 crc kubenswrapper[4749]: E0310 16:12:29.568107 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-8701-account-create-update-nnbrt" podUID="783a67d8-9e22-4503-82a6-5f49fb50ee7b" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.570921 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-116d-account-create-update-t69qj"] Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.574527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0229a2-b07d-4baa-8b4c-a1c356e38679-metrics-certs-tls-certs" (OuterVolumeSpecName: 
"metrics-certs-tls-certs") pod "8a0229a2-b07d-4baa-8b4c-a1c356e38679" (UID: "8a0229a2-b07d-4baa-8b4c-a1c356e38679"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.577848 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "852c97ea-349d-4262-b36c-2ef7aa81ae75" (UID: "852c97ea-349d-4262-b36c-2ef7aa81ae75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.583083 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgrfd\" (UniqueName: \"kubernetes.io/projected/b5ba9db0-29a2-468a-ab78-871620e30790-kube-api-access-lgrfd\") pod \"b5ba9db0-29a2-468a-ab78-871620e30790\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.583180 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-config-data\") pod \"b5ba9db0-29a2-468a-ab78-871620e30790\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.583199 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-nova-novncproxy-tls-certs\") pod \"b5ba9db0-29a2-468a-ab78-871620e30790\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.583253 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wshzj\" (UniqueName: 
\"kubernetes.io/projected/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-kube-api-access-wshzj\") pod \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.583270 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-galera-tls-certs\") pod \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.583321 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.583343 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-combined-ca-bundle\") pod \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.583400 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-combined-ca-bundle\") pod \"b5ba9db0-29a2-468a-ab78-871620e30790\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.583425 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-config-data-default\") pod \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 
16:12:29.583482 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-config-data-generated\") pod \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.583511 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-kolla-config\") pod \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.583533 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-vencrypt-tls-certs\") pod \"b5ba9db0-29a2-468a-ab78-871620e30790\" (UID: \"b5ba9db0-29a2-468a-ab78-871620e30790\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.583554 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-operator-scripts\") pod \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\" (UID: \"feb87bc1-b9a8-44e7-8603-ba656ef9e65c\") " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.584010 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.584022 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e75ef50-1c0b-498e-8448-39a7c8912f96-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.587503 4749 
reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0229a2-b07d-4baa-8b4c-a1c356e38679-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.587532 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/852c97ea-349d-4262-b36c-2ef7aa81ae75-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.589773 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "feb87bc1-b9a8-44e7-8603-ba656ef9e65c" (UID: "feb87bc1-b9a8-44e7-8603-ba656ef9e65c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.594567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-config-data" (OuterVolumeSpecName: "config-data") pod "8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" (UID: "8885a49f-7161-40b8-aac2-ee8ad3e0a1b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.596597 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5ba9db0-29a2-468a-ab78-871620e30790-kube-api-access-lgrfd" (OuterVolumeSpecName: "kube-api-access-lgrfd") pod "b5ba9db0-29a2-468a-ab78-871620e30790" (UID: "b5ba9db0-29a2-468a-ab78-871620e30790"). InnerVolumeSpecName "kube-api-access-lgrfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: E0310 16:12:29.598629 4749 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 16:12:29 crc kubenswrapper[4749]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: if [ -n "nova_api" ]; then Mar 10 16:12:29 crc kubenswrapper[4749]: GRANT_DATABASE="nova_api" Mar 10 16:12:29 crc kubenswrapper[4749]: else Mar 10 16:12:29 crc kubenswrapper[4749]: GRANT_DATABASE="*" Mar 10 16:12:29 crc kubenswrapper[4749]: fi Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: # going for maximum compatibility here: Mar 10 16:12:29 crc kubenswrapper[4749]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 10 16:12:29 crc kubenswrapper[4749]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 10 16:12:29 crc kubenswrapper[4749]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 10 16:12:29 crc kubenswrapper[4749]: # support updates Mar 10 16:12:29 crc kubenswrapper[4749]: Mar 10 16:12:29 crc kubenswrapper[4749]: $MYSQL_CMD < logger="UnhandledError" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.600309 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "feb87bc1-b9a8-44e7-8603-ba656ef9e65c" (UID: "feb87bc1-b9a8-44e7-8603-ba656ef9e65c"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: E0310 16:12:29.600411 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-116d-account-create-update-t69qj" podUID="101ece94-d304-4797-a87d-e7fc8deb6199" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.601680 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "feb87bc1-b9a8-44e7-8603-ba656ef9e65c" (UID: "feb87bc1-b9a8-44e7-8603-ba656ef9e65c"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.626902 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b3814c41-600a-4463-9695-e55c293ffead" (UID: "b3814c41-600a-4463-9695-e55c293ffead"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.627912 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "feb87bc1-b9a8-44e7-8603-ba656ef9e65c" (UID: "feb87bc1-b9a8-44e7-8603-ba656ef9e65c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.635580 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9945ae2b-1140-4eb6-8212-c56f874dc891" path="/var/lib/kubelet/pods/9945ae2b-1140-4eb6-8212-c56f874dc891/volumes" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.636185 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a738c4f5-0130-4bf4-85a4-ebbf9188ac51" path="/var/lib/kubelet/pods/a738c4f5-0130-4bf4-85a4-ebbf9188ac51/volumes" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.636895 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdf02d6a-5794-4b1d-b155-f683bdb8680d" path="/var/lib/kubelet/pods/bdf02d6a-5794-4b1d-b155-f683bdb8680d/volumes" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.676547 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "feb87bc1-b9a8-44e7-8603-ba656ef9e65c" (UID: "feb87bc1-b9a8-44e7-8603-ba656ef9e65c"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.694273 4749 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.694304 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.694313 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgrfd\" (UniqueName: \"kubernetes.io/projected/b5ba9db0-29a2-468a-ab78-871620e30790-kube-api-access-lgrfd\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.694323 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3814c41-600a-4463-9695-e55c293ffead-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.694341 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.694351 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.694360 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 
16:12:29.694368 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.762705 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-kube-api-access-wshzj" (OuterVolumeSpecName: "kube-api-access-wshzj") pod "feb87bc1-b9a8-44e7-8603-ba656ef9e65c" (UID: "feb87bc1-b9a8-44e7-8603-ba656ef9e65c"). InnerVolumeSpecName "kube-api-access-wshzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.886736 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.887329 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="ceilometer-central-agent" containerID="cri-o://595e6a774c6f1fb6971d897ceee5714bc8b70476939e4b04c3cdfc26a133bd65" gracePeriod=30 Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.895905 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="proxy-httpd" containerID="cri-o://d4163978450ae5a28c7305f78e151c8b39face70e710fef1aa4e65399f74f360" gracePeriod=30 Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.896015 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="sg-core" containerID="cri-o://a7e54b42d006c7f4c24dab0c52ee76e67b32f801f6a37cf60527b10f12948e8b" gracePeriod=30 Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.896065 4749 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="ceilometer-notification-agent" containerID="cri-o://fd96a7ab3a64b263ae86578a46b4ac785d6b725cfd0504260b8b86b6c6c66caa" gracePeriod=30 Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.942820 4749 generic.go:334] "Generic (PLEG): container finished" podID="b5ba9db0-29a2-468a-ab78-871620e30790" containerID="f431f76071d5cc54434dee7be2f8613b19cde266ddb731b3a3189507c34c42ae" exitCode=0 Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.942919 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b5ba9db0-29a2-468a-ab78-871620e30790","Type":"ContainerDied","Data":"f431f76071d5cc54434dee7be2f8613b19cde266ddb731b3a3189507c34c42ae"} Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.942946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b5ba9db0-29a2-468a-ab78-871620e30790","Type":"ContainerDied","Data":"33406fead6089a5d9460b37d43d2b0208cbf151ffbc0eab27176c0dee2182fba"} Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.942973 4749 scope.go:117] "RemoveContainer" containerID="f431f76071d5cc54434dee7be2f8613b19cde266ddb731b3a3189507c34c42ae" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.943131 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.955813 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wshzj\" (UniqueName: \"kubernetes.io/projected/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-kube-api-access-wshzj\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:29 crc kubenswrapper[4749]: I0310 16:12:29.968546 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-config-data" (OuterVolumeSpecName: "config-data") pod "b5ba9db0-29a2-468a-ab78-871620e30790" (UID: "b5ba9db0-29a2-468a-ab78-871620e30790"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.018798 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.019043 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="0a8246b1-28b8-4eb6-83a3-1e87beecfb78" containerName="kube-state-metrics" containerID="cri-o://fc7c26df644965a11d66d3971b0f14d486ea00dfbfe753e9e176059c55e661ad" gracePeriod=30 Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.022827 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vvxvc" event={"ID":"1ac12213-5bcb-465c-a6aa-fa9e8e97c290","Type":"ContainerStarted","Data":"5ab5b8a23a717e3d7583ae90ab05cc659a2772d63ab4a793dc298238452ec013"} Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.023056 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "b5ba9db0-29a2-468a-ab78-871620e30790" (UID: "b5ba9db0-29a2-468a-ab78-871620e30790"). 
InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.028863 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "feb87bc1-b9a8-44e7-8603-ba656ef9e65c" (UID: "feb87bc1-b9a8-44e7-8603-ba656ef9e65c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.050624 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "b5ba9db0-29a2-468a-ab78-871620e30790" (UID: "b5ba9db0-29a2-468a-ab78-871620e30790"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.087945 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.089389 4749 generic.go:334] "Generic (PLEG): container finished" podID="7e75ef50-1c0b-498e-8448-39a7c8912f96" containerID="9439fbc59b1f401d9aeefa7b3aad4d384a86e1422ff6cf26455e69cb43403ea8" exitCode=0 Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.089505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94bd49868-nj59v" event={"ID":"7e75ef50-1c0b-498e-8448-39a7c8912f96","Type":"ContainerDied","Data":"9439fbc59b1f401d9aeefa7b3aad4d384a86e1422ff6cf26455e69cb43403ea8"} Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.089540 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94bd49868-nj59v" 
event={"ID":"7e75ef50-1c0b-498e-8448-39a7c8912f96","Type":"ContainerDied","Data":"f6cbc30e46a0924e0dc742ff7bb9f702d15b8ccd9fe4cf6cff5c479718b4a71f"} Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.089647 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-94bd49868-nj59v" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.167893 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.168101 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="ec710cfc-8539-47c5-8062-95911f973074" containerName="memcached" containerID="cri-o://62c6ec9ae9969d0a788db39c30b51835114c25e683ecb1bdacaa7737d2e96d89" gracePeriod=30 Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.169261 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.169286 4749 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.169297 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.169306 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.169314 4749 reconciler_common.go:293] 
"Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.210594 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-67dd78ff7-qfbxb" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.215192 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0bf0-account-create-update-pxvvz"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.215230 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67dd78ff7-qfbxb" event={"ID":"852c97ea-349d-4262-b36c-2ef7aa81ae75","Type":"ContainerDied","Data":"7e49d1de40d2ec08bea683ec20f01f5bf6b672b490dd965c02c35e2420ea56a1"} Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.220125 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0bf0-account-create-update-pxvvz"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.225971 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0bf0-account-create-update-6sg9m"] Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226517 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852c97ea-349d-4262-b36c-2ef7aa81ae75" containerName="proxy-httpd" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226535 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="852c97ea-349d-4262-b36c-2ef7aa81ae75" containerName="proxy-httpd" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226548 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99aedb1b-bca3-41ef-9399-4678f86ac87c" containerName="ovsdbserver-sb" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226557 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="99aedb1b-bca3-41ef-9399-4678f86ac87c" containerName="ovsdbserver-sb" Mar 10 
16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226582 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dafd71a4-7276-4bce-84d9-6568e9d38d9d" containerName="init" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226590 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafd71a4-7276-4bce-84d9-6568e9d38d9d" containerName="init" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226608 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dafd71a4-7276-4bce-84d9-6568e9d38d9d" containerName="dnsmasq-dns" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226616 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafd71a4-7276-4bce-84d9-6568e9d38d9d" containerName="dnsmasq-dns" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226639 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3814c41-600a-4463-9695-e55c293ffead" containerName="openstack-network-exporter" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226648 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3814c41-600a-4463-9695-e55c293ffead" containerName="openstack-network-exporter" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226661 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0229a2-b07d-4baa-8b4c-a1c356e38679" containerName="openstack-network-exporter" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226669 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0229a2-b07d-4baa-8b4c-a1c356e38679" containerName="openstack-network-exporter" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226681 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ba9db0-29a2-468a-ab78-871620e30790" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226689 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ba9db0-29a2-468a-ab78-871620e30790" 
containerName="nova-cell1-novncproxy-novncproxy" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226703 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" containerName="barbican-worker-log" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226711 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" containerName="barbican-worker-log" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226723 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99aedb1b-bca3-41ef-9399-4678f86ac87c" containerName="openstack-network-exporter" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226730 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="99aedb1b-bca3-41ef-9399-4678f86ac87c" containerName="openstack-network-exporter" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226742 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e75ef50-1c0b-498e-8448-39a7c8912f96" containerName="barbican-keystone-listener-log" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226749 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e75ef50-1c0b-498e-8448-39a7c8912f96" containerName="barbican-keystone-listener-log" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226766 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb87bc1-b9a8-44e7-8603-ba656ef9e65c" containerName="galera" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226775 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb87bc1-b9a8-44e7-8603-ba656ef9e65c" containerName="galera" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226789 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" containerName="barbican-worker" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226796 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" containerName="barbican-worker" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226811 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3814c41-600a-4463-9695-e55c293ffead" containerName="ovsdbserver-nb" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226820 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3814c41-600a-4463-9695-e55c293ffead" containerName="ovsdbserver-nb" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226832 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb87bc1-b9a8-44e7-8603-ba656ef9e65c" containerName="mysql-bootstrap" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226839 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb87bc1-b9a8-44e7-8603-ba656ef9e65c" containerName="mysql-bootstrap" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226850 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852c97ea-349d-4262-b36c-2ef7aa81ae75" containerName="proxy-server" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226858 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="852c97ea-349d-4262-b36c-2ef7aa81ae75" containerName="proxy-server" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.226878 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e75ef50-1c0b-498e-8448-39a7c8912f96" containerName="barbican-keystone-listener" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.226885 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e75ef50-1c0b-498e-8448-39a7c8912f96" containerName="barbican-keystone-listener" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.227129 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" containerName="barbican-worker" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.227147 4749 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="852c97ea-349d-4262-b36c-2ef7aa81ae75" containerName="proxy-httpd" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.227162 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" containerName="barbican-worker-log" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.227177 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb87bc1-b9a8-44e7-8603-ba656ef9e65c" containerName="galera" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.227190 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ba9db0-29a2-468a-ab78-871620e30790" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.227198 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e75ef50-1c0b-498e-8448-39a7c8912f96" containerName="barbican-keystone-listener" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.227209 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="852c97ea-349d-4262-b36c-2ef7aa81ae75" containerName="proxy-server" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.227228 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3814c41-600a-4463-9695-e55c293ffead" containerName="openstack-network-exporter" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.227242 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3814c41-600a-4463-9695-e55c293ffead" containerName="ovsdbserver-nb" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.227255 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="99aedb1b-bca3-41ef-9399-4678f86ac87c" containerName="ovsdbserver-sb" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.227270 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dafd71a4-7276-4bce-84d9-6568e9d38d9d" containerName="dnsmasq-dns" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 
16:12:30.227283 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e75ef50-1c0b-498e-8448-39a7c8912f96" containerName="barbican-keystone-listener-log" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.227298 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0229a2-b07d-4baa-8b4c-a1c356e38679" containerName="openstack-network-exporter" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.227309 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="99aedb1b-bca3-41ef-9399-4678f86ac87c" containerName="openstack-network-exporter" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.228122 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0bf0-account-create-update-6sg9m" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.235477 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.235762 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0bf0-account-create-update-6sg9m"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.238591 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "feb87bc1-b9a8-44e7-8603-ba656ef9e65c" (UID: "feb87bc1-b9a8-44e7-8603-ba656ef9e65c"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.238982 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5ba9db0-29a2-468a-ab78-871620e30790" (UID: "b5ba9db0-29a2-468a-ab78-871620e30790"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.261425 4749 generic.go:334] "Generic (PLEG): container finished" podID="feb87bc1-b9a8-44e7-8603-ba656ef9e65c" containerID="0ee3f2375f085150303c2b6ba4607901034d43e9521515b17af0d05668a11f6f" exitCode=0 Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.261542 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.262936 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vxg2v"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.262989 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vxg2v"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.263011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"feb87bc1-b9a8-44e7-8603-ba656ef9e65c","Type":"ContainerDied","Data":"0ee3f2375f085150303c2b6ba4607901034d43e9521515b17af0d05668a11f6f"} Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.263049 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"feb87bc1-b9a8-44e7-8603-ba656ef9e65c","Type":"ContainerDied","Data":"f08fdf2117dc8e63c0bf505cbe956ff8f16bd161d742f808cead2769cc5da5c9"} Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.263476 4749 scope.go:117] "RemoveContainer" containerID="f431f76071d5cc54434dee7be2f8613b19cde266ddb731b3a3189507c34c42ae" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.265573 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f431f76071d5cc54434dee7be2f8613b19cde266ddb731b3a3189507c34c42ae\": container with ID starting with f431f76071d5cc54434dee7be2f8613b19cde266ddb731b3a3189507c34c42ae not found: ID does not exist" 
containerID="f431f76071d5cc54434dee7be2f8613b19cde266ddb731b3a3189507c34c42ae" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.265622 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f431f76071d5cc54434dee7be2f8613b19cde266ddb731b3a3189507c34c42ae"} err="failed to get container status \"f431f76071d5cc54434dee7be2f8613b19cde266ddb731b3a3189507c34c42ae\": rpc error: code = NotFound desc = could not find container \"f431f76071d5cc54434dee7be2f8613b19cde266ddb731b3a3189507c34c42ae\": container with ID starting with f431f76071d5cc54434dee7be2f8613b19cde266ddb731b3a3189507c34c42ae not found: ID does not exist" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.265647 4749 scope.go:117] "RemoveContainer" containerID="9439fbc59b1f401d9aeefa7b3aad4d384a86e1422ff6cf26455e69cb43403ea8" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.277552 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ba9db0-29a2-468a-ab78-871620e30790-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.277583 4749 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb87bc1-b9a8-44e7-8603-ba656ef9e65c-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.282038 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-116d-account-create-update-t69qj" event={"ID":"101ece94-d304-4797-a87d-e7fc8deb6199","Type":"ContainerStarted","Data":"f2d9a57c0383a8d8797d1aeb09493f17c0da654d73069ec678a1c4a778403e91"} Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.294273 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-mrdvb"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.300685 4749 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/cinder-api-0" podUID="a0d845ea-a98a-43ae-9803-30e5d306d29d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.170:8776/healthcheck\": read tcp 10.217.0.2:36326->10.217.0.170:8776: read: connection reset by peer" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.305553 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8701-account-create-update-nnbrt" event={"ID":"783a67d8-9e22-4503-82a6-5f49fb50ee7b","Type":"ContainerStarted","Data":"8e5e2f4f3ad0cf18460eca25f65292620b9f05026a53e092b43ab34352ddf401"} Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.311081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-88fa-account-create-update-lqrp8" event={"ID":"615e021a-88a2-496f-81a4-46d70e40310d","Type":"ContainerStarted","Data":"f8d328470fa7460fe06631344b66d1d5dd32299fc1a6c488f121b8bff2e8efc7"} Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.317924 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67dc88fb49-f9s7n" event={"ID":"8885a49f-7161-40b8-aac2-ee8ad3e0a1b3","Type":"ContainerDied","Data":"da32b1813adbf953923f98ec50815b367083a3884d4405627ee737ccbf076a03"} Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.318088 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-67dc88fb49-f9s7n" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.319203 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-mrdvb"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.321130 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.326217 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-939f-account-create-update-l8hzh" event={"ID":"d5ef97dd-388e-4fc7-82bf-908c61ca2fe2","Type":"ContainerStarted","Data":"dfea804dcf2063efc860f1b53dc44ee184709b2291d2046b719eac794bfbe9ce"} Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.328616 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.328690 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-wptw6" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.328727 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57db588689-ff8h6" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.328757 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.331166 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d6cd8c57d-9v7dx"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.331537 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-d6cd8c57d-9v7dx" podUID="a7637a97-25f4-4696-a41c-545d0d6b0e9a" containerName="keystone-api" containerID="cri-o://261594765d431b29d11923174d8f5b406353566732ec1cf061a23975229933f0" gracePeriod=30 Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.363001 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.379202 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z74pp\" (UniqueName: \"kubernetes.io/projected/9227a6fc-a568-4df2-be6f-10e0eeb154d1-kube-api-access-z74pp\") pod \"keystone-0bf0-account-create-update-6sg9m\" (UID: \"9227a6fc-a568-4df2-be6f-10e0eeb154d1\") " pod="openstack/keystone-0bf0-account-create-update-6sg9m" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.379346 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9227a6fc-a568-4df2-be6f-10e0eeb154d1-operator-scripts\") pod \"keystone-0bf0-account-create-update-6sg9m\" (UID: \"9227a6fc-a568-4df2-be6f-10e0eeb154d1\") " pod="openstack/keystone-0bf0-account-create-update-6sg9m" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.379491 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.379537 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-config-data podName:d34f67ec-ba88-43c9-84af-2c59a2dbbbe3 nodeName:}" failed. No retries permitted until 2026-03-10 16:12:34.379522354 +0000 UTC m=+1451.501388041 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-config-data") pod "rabbitmq-cell1-server-0" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3") : configmap "rabbitmq-cell1-config-data" not found Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.397521 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-q5g9w"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.410089 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0bf0-account-create-update-6sg9m"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.439202 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-q5g9w"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.467830 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vvxvc"] Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.487820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z74pp\" (UniqueName: \"kubernetes.io/projected/9227a6fc-a568-4df2-be6f-10e0eeb154d1-kube-api-access-z74pp\") pod \"keystone-0bf0-account-create-update-6sg9m\" (UID: \"9227a6fc-a568-4df2-be6f-10e0eeb154d1\") " pod="openstack/keystone-0bf0-account-create-update-6sg9m" Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.487985 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9227a6fc-a568-4df2-be6f-10e0eeb154d1-operator-scripts\") pod \"keystone-0bf0-account-create-update-6sg9m\" (UID: \"9227a6fc-a568-4df2-be6f-10e0eeb154d1\") " 
pod="openstack/keystone-0bf0-account-create-update-6sg9m" Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.488114 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.488165 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9227a6fc-a568-4df2-be6f-10e0eeb154d1-operator-scripts podName:9227a6fc-a568-4df2-be6f-10e0eeb154d1 nodeName:}" failed. No retries permitted until 2026-03-10 16:12:30.988151253 +0000 UTC m=+1448.110016940 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9227a6fc-a568-4df2-be6f-10e0eeb154d1-operator-scripts") pod "keystone-0bf0-account-create-update-6sg9m" (UID: "9227a6fc-a568-4df2-be6f-10e0eeb154d1") : configmap "openstack-scripts" not found Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.496091 4749 projected.go:194] Error preparing data for projected volume kube-api-access-z74pp for pod openstack/keystone-0bf0-account-create-update-6sg9m: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 16:12:30 crc kubenswrapper[4749]: E0310 16:12:30.496152 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9227a6fc-a568-4df2-be6f-10e0eeb154d1-kube-api-access-z74pp podName:9227a6fc-a568-4df2-be6f-10e0eeb154d1 nodeName:}" failed. No retries permitted until 2026-03-10 16:12:30.996135224 +0000 UTC m=+1448.118000911 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z74pp" (UniqueName: "kubernetes.io/projected/9227a6fc-a568-4df2-be6f-10e0eeb154d1-kube-api-access-z74pp") pod "keystone-0bf0-account-create-update-6sg9m" (UID: "9227a6fc-a568-4df2-be6f-10e0eeb154d1") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 16:12:30 crc kubenswrapper[4749]: I0310 16:12:30.758947 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="2bf7c072-7f7d-4f94-98a5-023b069f0eab" containerName="galera" containerID="cri-o://cc722bd2c5f210c8bfbab86f5f74d73900920f0d1ede31e8eea776e399075b73" gracePeriod=30 Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.001389 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9227a6fc-a568-4df2-be6f-10e0eeb154d1-operator-scripts\") pod \"keystone-0bf0-account-create-update-6sg9m\" (UID: \"9227a6fc-a568-4df2-be6f-10e0eeb154d1\") " pod="openstack/keystone-0bf0-account-create-update-6sg9m" Mar 10 16:12:31 crc kubenswrapper[4749]: E0310 16:12:31.001787 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.001827 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z74pp\" (UniqueName: \"kubernetes.io/projected/9227a6fc-a568-4df2-be6f-10e0eeb154d1-kube-api-access-z74pp\") pod \"keystone-0bf0-account-create-update-6sg9m\" (UID: \"9227a6fc-a568-4df2-be6f-10e0eeb154d1\") " pod="openstack/keystone-0bf0-account-create-update-6sg9m" Mar 10 16:12:31 crc kubenswrapper[4749]: E0310 16:12:31.001864 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9227a6fc-a568-4df2-be6f-10e0eeb154d1-operator-scripts podName:9227a6fc-a568-4df2-be6f-10e0eeb154d1 nodeName:}" failed. 
No retries permitted until 2026-03-10 16:12:32.001838412 +0000 UTC m=+1449.123704159 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9227a6fc-a568-4df2-be6f-10e0eeb154d1-operator-scripts") pod "keystone-0bf0-account-create-update-6sg9m" (UID: "9227a6fc-a568-4df2-be6f-10e0eeb154d1") : configmap "openstack-scripts" not found Mar 10 16:12:31 crc kubenswrapper[4749]: E0310 16:12:31.005039 4749 projected.go:194] Error preparing data for projected volume kube-api-access-z74pp for pod openstack/keystone-0bf0-account-create-update-6sg9m: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 16:12:31 crc kubenswrapper[4749]: E0310 16:12:31.005110 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9227a6fc-a568-4df2-be6f-10e0eeb154d1-kube-api-access-z74pp podName:9227a6fc-a568-4df2-be6f-10e0eeb154d1 nodeName:}" failed. No retries permitted until 2026-03-10 16:12:32.005091772 +0000 UTC m=+1449.126957459 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-z74pp" (UniqueName: "kubernetes.io/projected/9227a6fc-a568-4df2-be6f-10e0eeb154d1-kube-api-access-z74pp") pod "keystone-0bf0-account-create-update-6sg9m" (UID: "9227a6fc-a568-4df2-be6f-10e0eeb154d1") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.226857 4749 scope.go:117] "RemoveContainer" containerID="9ab5efab7e6f316d75eb83fd5392310a8b177a64854f2535c92533beb31eecdb" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.271309 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-88fa-account-create-update-lqrp8" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.304392 4749 scope.go:117] "RemoveContainer" containerID="9439fbc59b1f401d9aeefa7b3aad4d384a86e1422ff6cf26455e69cb43403ea8" Mar 10 16:12:31 crc kubenswrapper[4749]: E0310 16:12:31.312438 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9439fbc59b1f401d9aeefa7b3aad4d384a86e1422ff6cf26455e69cb43403ea8\": container with ID starting with 9439fbc59b1f401d9aeefa7b3aad4d384a86e1422ff6cf26455e69cb43403ea8 not found: ID does not exist" containerID="9439fbc59b1f401d9aeefa7b3aad4d384a86e1422ff6cf26455e69cb43403ea8" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.312493 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9439fbc59b1f401d9aeefa7b3aad4d384a86e1422ff6cf26455e69cb43403ea8"} err="failed to get container status \"9439fbc59b1f401d9aeefa7b3aad4d384a86e1422ff6cf26455e69cb43403ea8\": rpc error: code = NotFound desc = could not find container \"9439fbc59b1f401d9aeefa7b3aad4d384a86e1422ff6cf26455e69cb43403ea8\": container with ID starting with 9439fbc59b1f401d9aeefa7b3aad4d384a86e1422ff6cf26455e69cb43403ea8 not found: ID does not exist" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.312524 4749 scope.go:117] "RemoveContainer" containerID="9ab5efab7e6f316d75eb83fd5392310a8b177a64854f2535c92533beb31eecdb" Mar 10 16:12:31 crc kubenswrapper[4749]: E0310 16:12:31.313245 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ab5efab7e6f316d75eb83fd5392310a8b177a64854f2535c92533beb31eecdb\": container with ID starting with 9ab5efab7e6f316d75eb83fd5392310a8b177a64854f2535c92533beb31eecdb not found: ID does not exist" containerID="9ab5efab7e6f316d75eb83fd5392310a8b177a64854f2535c92533beb31eecdb" Mar 10 16:12:31 crc 
kubenswrapper[4749]: I0310 16:12:31.313274 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab5efab7e6f316d75eb83fd5392310a8b177a64854f2535c92533beb31eecdb"} err="failed to get container status \"9ab5efab7e6f316d75eb83fd5392310a8b177a64854f2535c92533beb31eecdb\": rpc error: code = NotFound desc = could not find container \"9ab5efab7e6f316d75eb83fd5392310a8b177a64854f2535c92533beb31eecdb\": container with ID starting with 9ab5efab7e6f316d75eb83fd5392310a8b177a64854f2535c92533beb31eecdb not found: ID does not exist" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.313290 4749 scope.go:117] "RemoveContainer" containerID="fb1538a41411d8393adf1f0c90bcd1848a6d9fc8136457b097afbfd0c4663176" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.324124 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.333149 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-94bd49868-nj59v"] Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.341754 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-116d-account-create-update-t69qj" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.347452 4749 generic.go:334] "Generic (PLEG): container finished" podID="ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" containerID="98105fccfb28ffdfad3a0b356c5f8eb37b06bcf57c45dd3c31713662a31ddfe5" exitCode=0 Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.347518 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e","Type":"ContainerDied","Data":"98105fccfb28ffdfad3a0b356c5f8eb37b06bcf57c45dd3c31713662a31ddfe5"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.358193 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-116d-account-create-update-t69qj" event={"ID":"101ece94-d304-4797-a87d-e7fc8deb6199","Type":"ContainerDied","Data":"f2d9a57c0383a8d8797d1aeb09493f17c0da654d73069ec678a1c4a778403e91"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.358406 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-116d-account-create-update-t69qj" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.367134 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-94bd49868-nj59v"] Mar 10 16:12:31 crc kubenswrapper[4749]: E0310 16:12:31.367869 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-z74pp operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-0bf0-account-create-update-6sg9m" podUID="9227a6fc-a568-4df2-be6f-10e0eeb154d1" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.369012 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-939f-account-create-update-l8hzh" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.369140 4749 generic.go:334] "Generic (PLEG): container finished" podID="d61221be-c05f-47ae-a3b5-80f59d809281" containerID="223e73b546b611a128d4581fd3fab7f4ad5f58cffc7f3d629e05eb77a8f22f97" exitCode=0 Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.369186 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d61221be-c05f-47ae-a3b5-80f59d809281","Type":"ContainerDied","Data":"223e73b546b611a128d4581fd3fab7f4ad5f58cffc7f3d629e05eb77a8f22f97"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.373072 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8701-account-create-update-nnbrt" event={"ID":"783a67d8-9e22-4503-82a6-5f49fb50ee7b","Type":"ContainerDied","Data":"8e5e2f4f3ad0cf18460eca25f65292620b9f05026a53e092b43ab34352ddf401"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.373102 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e5e2f4f3ad0cf18460eca25f65292620b9f05026a53e092b43ab34352ddf401" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.373824 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8701-account-create-update-nnbrt" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.380646 4749 generic.go:334] "Generic (PLEG): container finished" podID="15480433-b4c2-47c5-a7e4-73395b5bd27d" containerID="236964c999aec36cddb5fe0239f2b923e3a235f3f2a6498c0a9402202498207b" exitCode=0 Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.380735 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15480433-b4c2-47c5-a7e4-73395b5bd27d","Type":"ContainerDied","Data":"236964c999aec36cddb5fe0239f2b923e3a235f3f2a6498c0a9402202498207b"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.380763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15480433-b4c2-47c5-a7e4-73395b5bd27d","Type":"ContainerDied","Data":"155ef0556de8c3381936aa3a148a225cfea9fc1e296109ceebf18e89c3c6a91b"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.380776 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="155ef0556de8c3381936aa3a148a225cfea9fc1e296109ceebf18e89c3c6a91b" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.383101 4749 generic.go:334] "Generic (PLEG): container finished" podID="0a8246b1-28b8-4eb6-83a3-1e87beecfb78" containerID="fc7c26df644965a11d66d3971b0f14d486ea00dfbfe753e9e176059c55e661ad" exitCode=2 Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.383193 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57db588689-ff8h6"] Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.384823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0a8246b1-28b8-4eb6-83a3-1e87beecfb78","Type":"ContainerDied","Data":"fc7c26df644965a11d66d3971b0f14d486ea00dfbfe753e9e176059c55e661ad"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.388101 4749 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.389290 4749 generic.go:334] "Generic (PLEG): container finished" podID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerID="d4163978450ae5a28c7305f78e151c8b39face70e710fef1aa4e65399f74f360" exitCode=0 Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.389319 4749 generic.go:334] "Generic (PLEG): container finished" podID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerID="a7e54b42d006c7f4c24dab0c52ee76e67b32f801f6a37cf60527b10f12948e8b" exitCode=2 Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.389328 4749 generic.go:334] "Generic (PLEG): container finished" podID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerID="595e6a774c6f1fb6971d897ceee5714bc8b70476939e4b04c3cdfc26a133bd65" exitCode=0 Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.389378 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e3d73b4-812e-4486-8467-87c6dfd6ee92","Type":"ContainerDied","Data":"d4163978450ae5a28c7305f78e151c8b39face70e710fef1aa4e65399f74f360"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.389420 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e3d73b4-812e-4486-8467-87c6dfd6ee92","Type":"ContainerDied","Data":"a7e54b42d006c7f4c24dab0c52ee76e67b32f801f6a37cf60527b10f12948e8b"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.389431 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e3d73b4-812e-4486-8467-87c6dfd6ee92","Type":"ContainerDied","Data":"595e6a774c6f1fb6971d897ceee5714bc8b70476939e4b04c3cdfc26a133bd65"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.404827 4749 scope.go:117] "RemoveContainer" containerID="aaeec1a32c2eac40dd562bef0ec6d26e9e7922c02915dd496d085b035d04bd0f" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.404971 
4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-88fa-account-create-update-lqrp8" event={"ID":"615e021a-88a2-496f-81a4-46d70e40310d","Type":"ContainerDied","Data":"f8d328470fa7460fe06631344b66d1d5dd32299fc1a6c488f121b8bff2e8efc7"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.405035 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-88fa-account-create-update-lqrp8" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.408281 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57db588689-ff8h6"] Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.409511 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg9sc\" (UniqueName: \"kubernetes.io/projected/7cc64163-530a-4b31-9acc-84910336b781-kube-api-access-tg9sc\") pod \"7cc64163-530a-4b31-9acc-84910336b781\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.409544 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-internal-tls-certs\") pod \"7cc64163-530a-4b31-9acc-84910336b781\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.409591 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-scripts\") pod \"7cc64163-530a-4b31-9acc-84910336b781\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.409613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-combined-ca-bundle\") pod 
\"7cc64163-530a-4b31-9acc-84910336b781\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.409702 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gb86\" (UniqueName: \"kubernetes.io/projected/615e021a-88a2-496f-81a4-46d70e40310d-kube-api-access-5gb86\") pod \"615e021a-88a2-496f-81a4-46d70e40310d\" (UID: \"615e021a-88a2-496f-81a4-46d70e40310d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.409736 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-config-data\") pod \"7cc64163-530a-4b31-9acc-84910336b781\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.409782 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/615e021a-88a2-496f-81a4-46d70e40310d-operator-scripts\") pod \"615e021a-88a2-496f-81a4-46d70e40310d\" (UID: \"615e021a-88a2-496f-81a4-46d70e40310d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.409812 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cc64163-530a-4b31-9acc-84910336b781-logs\") pod \"7cc64163-530a-4b31-9acc-84910336b781\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.409836 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-public-tls-certs\") pod \"7cc64163-530a-4b31-9acc-84910336b781\" (UID: \"7cc64163-530a-4b31-9acc-84910336b781\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.430995 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/615e021a-88a2-496f-81a4-46d70e40310d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "615e021a-88a2-496f-81a4-46d70e40310d" (UID: "615e021a-88a2-496f-81a4-46d70e40310d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.432783 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cc64163-530a-4b31-9acc-84910336b781-logs" (OuterVolumeSpecName: "logs") pod "7cc64163-530a-4b31-9acc-84910336b781" (UID: "7cc64163-530a-4b31-9acc-84910336b781"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.433292 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc64163-530a-4b31-9acc-84910336b781-kube-api-access-tg9sc" (OuterVolumeSpecName: "kube-api-access-tg9sc") pod "7cc64163-530a-4b31-9acc-84910336b781" (UID: "7cc64163-530a-4b31-9acc-84910336b781"). InnerVolumeSpecName "kube-api-access-tg9sc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.437048 4749 generic.go:334] "Generic (PLEG): container finished" podID="ec710cfc-8539-47c5-8062-95911f973074" containerID="62c6ec9ae9969d0a788db39c30b51835114c25e683ecb1bdacaa7737d2e96d89" exitCode=0 Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.437149 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ec710cfc-8539-47c5-8062-95911f973074","Type":"ContainerDied","Data":"62c6ec9ae9969d0a788db39c30b51835114c25e683ecb1bdacaa7737d2e96d89"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.446121 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.446513 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-scripts" (OuterVolumeSpecName: "scripts") pod "7cc64163-530a-4b31-9acc-84910336b781" (UID: "7cc64163-530a-4b31-9acc-84910336b781"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: E0310 16:12:31.457641 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c94663d92d50885e5f9777c2186efd9f1a68ab7dca303b557f1ab7d4547ae21e is running failed: container process not found" containerID="c94663d92d50885e5f9777c2186efd9f1a68ab7dca303b557f1ab7d4547ae21e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 16:12:31 crc kubenswrapper[4749]: E0310 16:12:31.464027 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c94663d92d50885e5f9777c2186efd9f1a68ab7dca303b557f1ab7d4547ae21e is running failed: container process not found" containerID="c94663d92d50885e5f9777c2186efd9f1a68ab7dca303b557f1ab7d4547ae21e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.464658 4749 generic.go:334] "Generic (PLEG): container finished" podID="a0d845ea-a98a-43ae-9803-30e5d306d29d" containerID="f8d45398159e5ac21928a6fe846afd80a890bf4e12f47c045bfc44f153806dda" exitCode=0 Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.464934 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0d845ea-a98a-43ae-9803-30e5d306d29d","Type":"ContainerDied","Data":"f8d45398159e5ac21928a6fe846afd80a890bf4e12f47c045bfc44f153806dda"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 
16:12:31.465012 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0d845ea-a98a-43ae-9803-30e5d306d29d","Type":"ContainerDied","Data":"c097dd8012cac30d05e75fe83efdca1aaa8a554789942b137dbade44d1339656"} Mar 10 16:12:31 crc kubenswrapper[4749]: E0310 16:12:31.465065 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c94663d92d50885e5f9777c2186efd9f1a68ab7dca303b557f1ab7d4547ae21e is running failed: container process not found" containerID="c94663d92d50885e5f9777c2186efd9f1a68ab7dca303b557f1ab7d4547ae21e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 16:12:31 crc kubenswrapper[4749]: E0310 16:12:31.465106 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c94663d92d50885e5f9777c2186efd9f1a68ab7dca303b557f1ab7d4547ae21e is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="e80985ef-0a5d-403a-b351-c59bd878723d" containerName="nova-cell1-conductor-conductor" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.465016 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f7b864884-n5l5z" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.476091 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-939f-account-create-update-l8hzh" event={"ID":"d5ef97dd-388e-4fc7-82bf-908c61ca2fe2","Type":"ContainerDied","Data":"dfea804dcf2063efc860f1b53dc44ee184709b2291d2046b719eac794bfbe9ce"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.477059 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-939f-account-create-update-l8hzh" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.481221 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-67dc88fb49-f9s7n"] Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.502393 4749 generic.go:334] "Generic (PLEG): container finished" podID="fd8a90f3-a6d3-428e-a049-78cb36e2ed34" containerID="a5a9d70f8163b90e1531e0cff378ba14e371d6d2dacaf35b433db16489b0e4f9" exitCode=0 Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.502471 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f7b864884-n5l5z" event={"ID":"fd8a90f3-a6d3-428e-a049-78cb36e2ed34","Type":"ContainerDied","Data":"a5a9d70f8163b90e1531e0cff378ba14e371d6d2dacaf35b433db16489b0e4f9"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.502507 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f7b864884-n5l5z" event={"ID":"fd8a90f3-a6d3-428e-a049-78cb36e2ed34","Type":"ContainerDied","Data":"1f59993e0f183c523897becf9b929953b64a82be498862ee5a179b1f59b4563d"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.502594 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f7b864884-n5l5z" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.504732 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615e021a-88a2-496f-81a4-46d70e40310d-kube-api-access-5gb86" (OuterVolumeSpecName: "kube-api-access-5gb86") pod "615e021a-88a2-496f-81a4-46d70e40310d" (UID: "615e021a-88a2-496f-81a4-46d70e40310d"). InnerVolumeSpecName "kube-api-access-5gb86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.508033 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-67dc88fb49-f9s7n"] Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.512074 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-combined-ca-bundle\") pod \"15480433-b4c2-47c5-a7e4-73395b5bd27d\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.512147 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/783a67d8-9e22-4503-82a6-5f49fb50ee7b-operator-scripts\") pod \"783a67d8-9e22-4503-82a6-5f49fb50ee7b\" (UID: \"783a67d8-9e22-4503-82a6-5f49fb50ee7b\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.512211 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-public-tls-certs\") pod \"15480433-b4c2-47c5-a7e4-73395b5bd27d\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.512340 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-scripts\") pod \"15480433-b4c2-47c5-a7e4-73395b5bd27d\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.512399 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"15480433-b4c2-47c5-a7e4-73395b5bd27d\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 
16:12:31.512443 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxd9v\" (UniqueName: \"kubernetes.io/projected/15480433-b4c2-47c5-a7e4-73395b5bd27d-kube-api-access-wxd9v\") pod \"15480433-b4c2-47c5-a7e4-73395b5bd27d\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.512490 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15480433-b4c2-47c5-a7e4-73395b5bd27d-httpd-run\") pod \"15480433-b4c2-47c5-a7e4-73395b5bd27d\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.512517 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srx6t\" (UniqueName: \"kubernetes.io/projected/783a67d8-9e22-4503-82a6-5f49fb50ee7b-kube-api-access-srx6t\") pod \"783a67d8-9e22-4503-82a6-5f49fb50ee7b\" (UID: \"783a67d8-9e22-4503-82a6-5f49fb50ee7b\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.512565 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-config-data\") pod \"15480433-b4c2-47c5-a7e4-73395b5bd27d\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.512598 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z62h5\" (UniqueName: \"kubernetes.io/projected/101ece94-d304-4797-a87d-e7fc8deb6199-kube-api-access-z62h5\") pod \"101ece94-d304-4797-a87d-e7fc8deb6199\" (UID: \"101ece94-d304-4797-a87d-e7fc8deb6199\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.512618 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d5ef97dd-388e-4fc7-82bf-908c61ca2fe2-operator-scripts\") pod \"d5ef97dd-388e-4fc7-82bf-908c61ca2fe2\" (UID: \"d5ef97dd-388e-4fc7-82bf-908c61ca2fe2\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.512648 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15480433-b4c2-47c5-a7e4-73395b5bd27d-logs\") pod \"15480433-b4c2-47c5-a7e4-73395b5bd27d\" (UID: \"15480433-b4c2-47c5-a7e4-73395b5bd27d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.512699 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d59n\" (UniqueName: \"kubernetes.io/projected/d5ef97dd-388e-4fc7-82bf-908c61ca2fe2-kube-api-access-4d59n\") pod \"d5ef97dd-388e-4fc7-82bf-908c61ca2fe2\" (UID: \"d5ef97dd-388e-4fc7-82bf-908c61ca2fe2\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.512735 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/101ece94-d304-4797-a87d-e7fc8deb6199-operator-scripts\") pod \"101ece94-d304-4797-a87d-e7fc8deb6199\" (UID: \"101ece94-d304-4797-a87d-e7fc8deb6199\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.513211 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/101ece94-d304-4797-a87d-e7fc8deb6199-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "101ece94-d304-4797-a87d-e7fc8deb6199" (UID: "101ece94-d304-4797-a87d-e7fc8deb6199"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.514909 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/615e021a-88a2-496f-81a4-46d70e40310d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.514937 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cc64163-530a-4b31-9acc-84910336b781-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.514948 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/101ece94-d304-4797-a87d-e7fc8deb6199-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.514987 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg9sc\" (UniqueName: \"kubernetes.io/projected/7cc64163-530a-4b31-9acc-84910336b781-kube-api-access-tg9sc\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.514999 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.515011 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gb86\" (UniqueName: \"kubernetes.io/projected/615e021a-88a2-496f-81a4-46d70e40310d-kube-api-access-5gb86\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.515133 4749 scope.go:117] "RemoveContainer" containerID="0ee3f2375f085150303c2b6ba4607901034d43e9521515b17af0d05668a11f6f" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.516031 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/15480433-b4c2-47c5-a7e4-73395b5bd27d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "15480433-b4c2-47c5-a7e4-73395b5bd27d" (UID: "15480433-b4c2-47c5-a7e4-73395b5bd27d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.516363 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef97dd-388e-4fc7-82bf-908c61ca2fe2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5ef97dd-388e-4fc7-82bf-908c61ca2fe2" (UID: "d5ef97dd-388e-4fc7-82bf-908c61ca2fe2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.516727 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15480433-b4c2-47c5-a7e4-73395b5bd27d-logs" (OuterVolumeSpecName: "logs") pod "15480433-b4c2-47c5-a7e4-73395b5bd27d" (UID: "15480433-b4c2-47c5-a7e4-73395b5bd27d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.516973 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783a67d8-9e22-4503-82a6-5f49fb50ee7b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "783a67d8-9e22-4503-82a6-5f49fb50ee7b" (UID: "783a67d8-9e22-4503-82a6-5f49fb50ee7b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.534680 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "15480433-b4c2-47c5-a7e4-73395b5bd27d" (UID: "15480433-b4c2-47c5-a7e4-73395b5bd27d"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.535581 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/101ece94-d304-4797-a87d-e7fc8deb6199-kube-api-access-z62h5" (OuterVolumeSpecName: "kube-api-access-z62h5") pod "101ece94-d304-4797-a87d-e7fc8deb6199" (UID: "101ece94-d304-4797-a87d-e7fc8deb6199"). InnerVolumeSpecName "kube-api-access-z62h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.536149 4749 generic.go:334] "Generic (PLEG): container finished" podID="7cc64163-530a-4b31-9acc-84910336b781" containerID="57585bb9b412ce5a752b8acb87716ffcfdbab4a41883a915954d36af9a0479b4" exitCode=0 Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.536400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8dfcffcf6-962bk" event={"ID":"7cc64163-530a-4b31-9acc-84910336b781","Type":"ContainerDied","Data":"57585bb9b412ce5a752b8acb87716ffcfdbab4a41883a915954d36af9a0479b4"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.536581 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8dfcffcf6-962bk" event={"ID":"7cc64163-530a-4b31-9acc-84910336b781","Type":"ContainerDied","Data":"b37ba40d7206c0af8fcc7ac5985e11349cbc974a49612fd2dc99c32c5aa1f203"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.536783 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8dfcffcf6-962bk" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.544950 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783a67d8-9e22-4503-82a6-5f49fb50ee7b-kube-api-access-srx6t" (OuterVolumeSpecName: "kube-api-access-srx6t") pod "783a67d8-9e22-4503-82a6-5f49fb50ee7b" (UID: "783a67d8-9e22-4503-82a6-5f49fb50ee7b"). InnerVolumeSpecName "kube-api-access-srx6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.546879 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ef97dd-388e-4fc7-82bf-908c61ca2fe2-kube-api-access-4d59n" (OuterVolumeSpecName: "kube-api-access-4d59n") pod "d5ef97dd-388e-4fc7-82bf-908c61ca2fe2" (UID: "d5ef97dd-388e-4fc7-82bf-908c61ca2fe2"). InnerVolumeSpecName "kube-api-access-4d59n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.552330 4749 generic.go:334] "Generic (PLEG): container finished" podID="1b598099-b3f7-4157-8e5f-6eb472806511" containerID="5d60d1aa5d24cbc57ff5075376de396d004cbb8a0f8a549e929e2a81a8d75bd4" exitCode=0 Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.552459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1b598099-b3f7-4157-8e5f-6eb472806511","Type":"ContainerDied","Data":"5d60d1aa5d24cbc57ff5075376de396d004cbb8a0f8a549e929e2a81a8d75bd4"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.562557 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15480433-b4c2-47c5-a7e4-73395b5bd27d-kube-api-access-wxd9v" (OuterVolumeSpecName: "kube-api-access-wxd9v") pod "15480433-b4c2-47c5-a7e4-73395b5bd27d" (UID: "15480433-b4c2-47c5-a7e4-73395b5bd27d"). InnerVolumeSpecName "kube-api-access-wxd9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.563120 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-scripts" (OuterVolumeSpecName: "scripts") pod "15480433-b4c2-47c5-a7e4-73395b5bd27d" (UID: "15480433-b4c2-47c5-a7e4-73395b5bd27d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.566567 4749 generic.go:334] "Generic (PLEG): container finished" podID="e80985ef-0a5d-403a-b351-c59bd878723d" containerID="c94663d92d50885e5f9777c2186efd9f1a68ab7dca303b557f1ab7d4547ae21e" exitCode=0 Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.566632 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e80985ef-0a5d-403a-b351-c59bd878723d","Type":"ContainerDied","Data":"c94663d92d50885e5f9777c2186efd9f1a68ab7dca303b557f1ab7d4547ae21e"} Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.615018 4749 scope.go:117] "RemoveContainer" containerID="dbe74dbc6755c8b2dcb5ec38c4e8f76c03ab0c1ef203635518779cb018814c81" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617028 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-internal-tls-certs\") pod \"a0d845ea-a98a-43ae-9803-30e5d306d29d\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617060 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-internal-tls-certs\") pod \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617125 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-config-data-custom\") pod \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617152 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-px8j6\" (UniqueName: \"kubernetes.io/projected/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-kube-api-access-px8j6\") pod \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617242 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-public-tls-certs\") pod \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617280 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-config-data-custom\") pod \"a0d845ea-a98a-43ae-9803-30e5d306d29d\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617338 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-logs\") pod \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617363 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-public-tls-certs\") pod \"a0d845ea-a98a-43ae-9803-30e5d306d29d\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617382 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2ppp\" (UniqueName: \"kubernetes.io/projected/a0d845ea-a98a-43ae-9803-30e5d306d29d-kube-api-access-j2ppp\") pod \"a0d845ea-a98a-43ae-9803-30e5d306d29d\" (UID: 
\"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617462 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0d845ea-a98a-43ae-9803-30e5d306d29d-etc-machine-id\") pod \"a0d845ea-a98a-43ae-9803-30e5d306d29d\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617484 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-scripts\") pod \"a0d845ea-a98a-43ae-9803-30e5d306d29d\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617503 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-config-data\") pod \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617529 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-combined-ca-bundle\") pod \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\" (UID: \"fd8a90f3-a6d3-428e-a049-78cb36e2ed34\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617550 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-config-data\") pod \"a0d845ea-a98a-43ae-9803-30e5d306d29d\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617569 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-combined-ca-bundle\") pod \"a0d845ea-a98a-43ae-9803-30e5d306d29d\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617601 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d845ea-a98a-43ae-9803-30e5d306d29d-logs\") pod \"a0d845ea-a98a-43ae-9803-30e5d306d29d\" (UID: \"a0d845ea-a98a-43ae-9803-30e5d306d29d\") " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617942 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/783a67d8-9e22-4503-82a6-5f49fb50ee7b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617952 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617963 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxd9v\" (UniqueName: \"kubernetes.io/projected/15480433-b4c2-47c5-a7e4-73395b5bd27d-kube-api-access-wxd9v\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617982 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.617991 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15480433-b4c2-47c5-a7e4-73395b5bd27d-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.618002 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srx6t\" (UniqueName: 
\"kubernetes.io/projected/783a67d8-9e22-4503-82a6-5f49fb50ee7b-kube-api-access-srx6t\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.618011 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z62h5\" (UniqueName: \"kubernetes.io/projected/101ece94-d304-4797-a87d-e7fc8deb6199-kube-api-access-z62h5\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.618025 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ef97dd-388e-4fc7-82bf-908c61ca2fe2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.618033 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15480433-b4c2-47c5-a7e4-73395b5bd27d-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.618043 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d59n\" (UniqueName: \"kubernetes.io/projected/d5ef97dd-388e-4fc7-82bf-908c61ca2fe2-kube-api-access-4d59n\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.626984 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-logs" (OuterVolumeSpecName: "logs") pod "fd8a90f3-a6d3-428e-a049-78cb36e2ed34" (UID: "fd8a90f3-a6d3-428e-a049-78cb36e2ed34"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.628766 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0d845ea-a98a-43ae-9803-30e5d306d29d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a0d845ea-a98a-43ae-9803-30e5d306d29d" (UID: "a0d845ea-a98a-43ae-9803-30e5d306d29d"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.634263 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0d845ea-a98a-43ae-9803-30e5d306d29d-logs" (OuterVolumeSpecName: "logs") pod "a0d845ea-a98a-43ae-9803-30e5d306d29d" (UID: "a0d845ea-a98a-43ae-9803-30e5d306d29d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.646875 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15480433-b4c2-47c5-a7e4-73395b5bd27d" (UID: "15480433-b4c2-47c5-a7e4-73395b5bd27d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.653344 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dbeb14b-95ab-438e-b3e5-be66a6c34188" path="/var/lib/kubelet/pods/5dbeb14b-95ab-438e-b3e5-be66a6c34188/volumes" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.654707 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d73d287-764a-4274-a18c-f38c42e85d2d" path="/var/lib/kubelet/pods/6d73d287-764a-4274-a18c-f38c42e85d2d/volumes" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.656330 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e75ef50-1c0b-498e-8448-39a7c8912f96" path="/var/lib/kubelet/pods/7e75ef50-1c0b-498e-8448-39a7c8912f96/volumes" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.656545 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-kube-api-access-px8j6" (OuterVolumeSpecName: "kube-api-access-px8j6") pod 
"fd8a90f3-a6d3-428e-a049-78cb36e2ed34" (UID: "fd8a90f3-a6d3-428e-a049-78cb36e2ed34"). InnerVolumeSpecName "kube-api-access-px8j6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.661510 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fd8a90f3-a6d3-428e-a049-78cb36e2ed34" (UID: "fd8a90f3-a6d3-428e-a049-78cb36e2ed34"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.664011 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d845ea-a98a-43ae-9803-30e5d306d29d-kube-api-access-j2ppp" (OuterVolumeSpecName: "kube-api-access-j2ppp") pod "a0d845ea-a98a-43ae-9803-30e5d306d29d" (UID: "a0d845ea-a98a-43ae-9803-30e5d306d29d"). InnerVolumeSpecName "kube-api-access-j2ppp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.666603 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a0d845ea-a98a-43ae-9803-30e5d306d29d" (UID: "a0d845ea-a98a-43ae-9803-30e5d306d29d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.667077 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-scripts" (OuterVolumeSpecName: "scripts") pod "a0d845ea-a98a-43ae-9803-30e5d306d29d" (UID: "a0d845ea-a98a-43ae-9803-30e5d306d29d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.671718 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="846a5266-babb-4653-8226-952d8e09d90e" path="/var/lib/kubelet/pods/846a5266-babb-4653-8226-952d8e09d90e/volumes" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.674168 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8885a49f-7161-40b8-aac2-ee8ad3e0a1b3" path="/var/lib/kubelet/pods/8885a49f-7161-40b8-aac2-ee8ad3e0a1b3/volumes" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.677775 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1120a11-86ed-4714-891a-04bcba2a8ea2" path="/var/lib/kubelet/pods/d1120a11-86ed-4714-891a-04bcba2a8ea2/volumes" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.679547 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dafd71a4-7276-4bce-84d9-6568e9d38d9d" path="/var/lib/kubelet/pods/dafd71a4-7276-4bce-84d9-6568e9d38d9d/volumes" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.741109 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.741140 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px8j6\" (UniqueName: \"kubernetes.io/projected/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-kube-api-access-px8j6\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.741154 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.741167 4749 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.741180 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.741191 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2ppp\" (UniqueName: \"kubernetes.io/projected/a0d845ea-a98a-43ae-9803-30e5d306d29d-kube-api-access-j2ppp\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.741204 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0d845ea-a98a-43ae-9803-30e5d306d29d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.741215 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.741225 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d845ea-a98a-43ae-9803-30e5d306d29d-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.812307 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.813965 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cc64163-530a-4b31-9acc-84910336b781" (UID: 
"7cc64163-530a-4b31-9acc-84910336b781"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.842521 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd8a90f3-a6d3-428e-a049-78cb36e2ed34" (UID: "fd8a90f3-a6d3-428e-a049-78cb36e2ed34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.848795 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.848823 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.848834 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.891341 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "15480433-b4c2-47c5-a7e4-73395b5bd27d" (UID: "15480433-b4c2-47c5-a7e4-73395b5bd27d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.899473 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-config-data" (OuterVolumeSpecName: "config-data") pod "15480433-b4c2-47c5-a7e4-73395b5bd27d" (UID: "15480433-b4c2-47c5-a7e4-73395b5bd27d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.909667 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0d845ea-a98a-43ae-9803-30e5d306d29d" (UID: "a0d845ea-a98a-43ae-9803-30e5d306d29d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.933552 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a0d845ea-a98a-43ae-9803-30e5d306d29d" (UID: "a0d845ea-a98a-43ae-9803-30e5d306d29d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.948856 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-config-data" (OuterVolumeSpecName: "config-data") pod "7cc64163-530a-4b31-9acc-84910336b781" (UID: "7cc64163-530a-4b31-9acc-84910336b781"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.950013 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.950028 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.950038 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15480433-b4c2-47c5-a7e4-73395b5bd27d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.950047 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.950056 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.951863 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fd8a90f3-a6d3-428e-a049-78cb36e2ed34" (UID: "fd8a90f3-a6d3-428e-a049-78cb36e2ed34"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:31 crc kubenswrapper[4749]: I0310 16:12:31.953794 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-config-data" (OuterVolumeSpecName: "config-data") pod "a0d845ea-a98a-43ae-9803-30e5d306d29d" (UID: "a0d845ea-a98a-43ae-9803-30e5d306d29d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:31.989313 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-config-data" (OuterVolumeSpecName: "config-data") pod "fd8a90f3-a6d3-428e-a049-78cb36e2ed34" (UID: "fd8a90f3-a6d3-428e-a049-78cb36e2ed34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:31.993318 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fd8a90f3-a6d3-428e-a049-78cb36e2ed34" (UID: "fd8a90f3-a6d3-428e-a049-78cb36e2ed34"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:31.998616 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a0d845ea-a98a-43ae-9803-30e5d306d29d" (UID: "a0d845ea-a98a-43ae-9803-30e5d306d29d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.010707 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7cc64163-530a-4b31-9acc-84910336b781" (UID: "7cc64163-530a-4b31-9acc-84910336b781"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.012687 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7cc64163-530a-4b31-9acc-84910336b781" (UID: "7cc64163-530a-4b31-9acc-84910336b781"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.053903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9227a6fc-a568-4df2-be6f-10e0eeb154d1-operator-scripts\") pod \"keystone-0bf0-account-create-update-6sg9m\" (UID: \"9227a6fc-a568-4df2-be6f-10e0eeb154d1\") " pod="openstack/keystone-0bf0-account-create-update-6sg9m" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.054020 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z74pp\" (UniqueName: \"kubernetes.io/projected/9227a6fc-a568-4df2-be6f-10e0eeb154d1-kube-api-access-z74pp\") pod \"keystone-0bf0-account-create-update-6sg9m\" (UID: \"9227a6fc-a568-4df2-be6f-10e0eeb154d1\") " pod="openstack/keystone-0bf0-account-create-update-6sg9m" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.054189 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.054211 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.054225 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cc64163-530a-4b31-9acc-84910336b781-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.054241 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.054253 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.054264 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d845ea-a98a-43ae-9803-30e5d306d29d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.054275 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd8a90f3-a6d3-428e-a049-78cb36e2ed34-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.054722 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.054809 4749 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/configmap/9227a6fc-a568-4df2-be6f-10e0eeb154d1-operator-scripts podName:9227a6fc-a568-4df2-be6f-10e0eeb154d1 nodeName:}" failed. No retries permitted until 2026-03-10 16:12:34.054777444 +0000 UTC m=+1451.176643131 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9227a6fc-a568-4df2-be6f-10e0eeb154d1-operator-scripts") pod "keystone-0bf0-account-create-update-6sg9m" (UID: "9227a6fc-a568-4df2-be6f-10e0eeb154d1") : configmap "openstack-scripts" not found Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.059727 4749 projected.go:194] Error preparing data for projected volume kube-api-access-z74pp for pod openstack/keystone-0bf0-account-create-update-6sg9m: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.060031 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9227a6fc-a568-4df2-be6f-10e0eeb154d1-kube-api-access-z74pp podName:9227a6fc-a568-4df2-be6f-10e0eeb154d1 nodeName:}" failed. No retries permitted until 2026-03-10 16:12:34.059786182 +0000 UTC m=+1451.181651929 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z74pp" (UniqueName: "kubernetes.io/projected/9227a6fc-a568-4df2-be6f-10e0eeb154d1-kube-api-access-z74pp") pod "keystone-0bf0-account-create-update-6sg9m" (UID: "9227a6fc-a568-4df2-be6f-10e0eeb154d1") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.092765 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7 is running failed: container process not found" containerID="c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.093131 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7 is running failed: container process not found" containerID="c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.093436 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7 is running failed: container process not found" containerID="c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.093504 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7 is running failed: 
container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="46e39f11-450f-43a3-ba72-7c3e8245e382" containerName="nova-scheduler-scheduler" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.109259 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="1feaa4c9-2cec-45a8-9106-5be885c26eae" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.132450 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-67dd78ff7-qfbxb"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.132486 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-67dd78ff7-qfbxb"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.132507 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.132519 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.132528 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.132539 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.132548 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.132558 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.132568 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-wptw6"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.132578 4749 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-wptw6"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.132591 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.132600 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.148203 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.163253 4749 scope.go:117] "RemoveContainer" containerID="0ee3f2375f085150303c2b6ba4607901034d43e9521515b17af0d05668a11f6f" Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.163618 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee3f2375f085150303c2b6ba4607901034d43e9521515b17af0d05668a11f6f\": container with ID starting with 0ee3f2375f085150303c2b6ba4607901034d43e9521515b17af0d05668a11f6f not found: ID does not exist" containerID="0ee3f2375f085150303c2b6ba4607901034d43e9521515b17af0d05668a11f6f" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.163657 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee3f2375f085150303c2b6ba4607901034d43e9521515b17af0d05668a11f6f"} err="failed to get container status \"0ee3f2375f085150303c2b6ba4607901034d43e9521515b17af0d05668a11f6f\": rpc error: code = NotFound desc = could not find container \"0ee3f2375f085150303c2b6ba4607901034d43e9521515b17af0d05668a11f6f\": container with ID starting with 0ee3f2375f085150303c2b6ba4607901034d43e9521515b17af0d05668a11f6f not found: ID does not exist" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.163681 4749 scope.go:117] "RemoveContainer" containerID="dbe74dbc6755c8b2dcb5ec38c4e8f76c03ab0c1ef203635518779cb018814c81" Mar 10 
16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.163977 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbe74dbc6755c8b2dcb5ec38c4e8f76c03ab0c1ef203635518779cb018814c81\": container with ID starting with dbe74dbc6755c8b2dcb5ec38c4e8f76c03ab0c1ef203635518779cb018814c81 not found: ID does not exist" containerID="dbe74dbc6755c8b2dcb5ec38c4e8f76c03ab0c1ef203635518779cb018814c81" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.163995 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbe74dbc6755c8b2dcb5ec38c4e8f76c03ab0c1ef203635518779cb018814c81"} err="failed to get container status \"dbe74dbc6755c8b2dcb5ec38c4e8f76c03ab0c1ef203635518779cb018814c81\": rpc error: code = NotFound desc = could not find container \"dbe74dbc6755c8b2dcb5ec38c4e8f76c03ab0c1ef203635518779cb018814c81\": container with ID starting with dbe74dbc6755c8b2dcb5ec38c4e8f76c03ab0c1ef203635518779cb018814c81 not found: ID does not exist" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.164007 4749 scope.go:117] "RemoveContainer" containerID="eb58db301501949b729719960a2218e0ade2ee5b0f920eebc51295d80357ecf6" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.210096 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.218784 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.223485 4749 scope.go:117] "RemoveContainer" containerID="7ba23112caa9623ef0f1361b165f1a5f82eb24b9e458cf8a3e1047489301781f" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.252985 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.256941 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b598099-b3f7-4157-8e5f-6eb472806511-logs\") pod \"1b598099-b3f7-4157-8e5f-6eb472806511\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.257009 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-combined-ca-bundle\") pod \"1b598099-b3f7-4157-8e5f-6eb472806511\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.257086 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"1b598099-b3f7-4157-8e5f-6eb472806511\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.257140 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-config-data\") pod \"1b598099-b3f7-4157-8e5f-6eb472806511\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.257176 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-internal-tls-certs\") pod \"1b598099-b3f7-4157-8e5f-6eb472806511\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.257196 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/1b598099-b3f7-4157-8e5f-6eb472806511-httpd-run\") pod \"1b598099-b3f7-4157-8e5f-6eb472806511\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.257217 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-scripts\") pod \"1b598099-b3f7-4157-8e5f-6eb472806511\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.257234 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q55gw\" (UniqueName: \"kubernetes.io/projected/1b598099-b3f7-4157-8e5f-6eb472806511-kube-api-access-q55gw\") pod \"1b598099-b3f7-4157-8e5f-6eb472806511\" (UID: \"1b598099-b3f7-4157-8e5f-6eb472806511\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.259644 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b598099-b3f7-4157-8e5f-6eb472806511-logs" (OuterVolumeSpecName: "logs") pod "1b598099-b3f7-4157-8e5f-6eb472806511" (UID: "1b598099-b3f7-4157-8e5f-6eb472806511"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.260021 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b598099-b3f7-4157-8e5f-6eb472806511-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1b598099-b3f7-4157-8e5f-6eb472806511" (UID: "1b598099-b3f7-4157-8e5f-6eb472806511"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.269003 4749 scope.go:117] "RemoveContainer" containerID="f8d45398159e5ac21928a6fe846afd80a890bf4e12f47c045bfc44f153806dda" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.269497 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "1b598099-b3f7-4157-8e5f-6eb472806511" (UID: "1b598099-b3f7-4157-8e5f-6eb472806511"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.270630 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-88fa-account-create-update-lqrp8"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.271429 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-scripts" (OuterVolumeSpecName: "scripts") pod "1b598099-b3f7-4157-8e5f-6eb472806511" (UID: "1b598099-b3f7-4157-8e5f-6eb472806511"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.277755 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b598099-b3f7-4157-8e5f-6eb472806511-kube-api-access-q55gw" (OuterVolumeSpecName: "kube-api-access-q55gw") pod "1b598099-b3f7-4157-8e5f-6eb472806511" (UID: "1b598099-b3f7-4157-8e5f-6eb472806511"). InnerVolumeSpecName "kube-api-access-q55gw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.279252 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-88fa-account-create-update-lqrp8"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.281771 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.282786 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.285402 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.295298 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-116d-account-create-update-t69qj"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.306072 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b598099-b3f7-4157-8e5f-6eb472806511" (UID: "1b598099-b3f7-4157-8e5f-6eb472806511"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.322422 4749 scope.go:117] "RemoveContainer" containerID="0b98dca184d3190b9e30a3a5bcef1c368b98ad5b0683d689cbed67e3d6d6fa5c" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.360107 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec710cfc-8539-47c5-8062-95911f973074-combined-ca-bundle\") pod \"ec710cfc-8539-47c5-8062-95911f973074\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.360148 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-internal-tls-certs\") pod \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.360191 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec710cfc-8539-47c5-8062-95911f973074-memcached-tls-certs\") pod \"ec710cfc-8539-47c5-8062-95911f973074\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.360210 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-public-tls-certs\") pod \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.360234 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqj7c\" (UniqueName: \"kubernetes.io/projected/ec710cfc-8539-47c5-8062-95911f973074-kube-api-access-zqj7c\") pod \"ec710cfc-8539-47c5-8062-95911f973074\" (UID: 
\"ec710cfc-8539-47c5-8062-95911f973074\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.360261 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg9q9\" (UniqueName: \"kubernetes.io/projected/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-kube-api-access-bg9q9\") pod \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.360287 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-combined-ca-bundle\") pod \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.360770 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-config-data\") pod \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.360869 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec710cfc-8539-47c5-8062-95911f973074-kolla-config\") pod \"ec710cfc-8539-47c5-8062-95911f973074\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.361789 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec710cfc-8539-47c5-8062-95911f973074-config-data" (OuterVolumeSpecName: "config-data") pod "ec710cfc-8539-47c5-8062-95911f973074" (UID: "ec710cfc-8539-47c5-8062-95911f973074"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.362607 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec710cfc-8539-47c5-8062-95911f973074-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "ec710cfc-8539-47c5-8062-95911f973074" (UID: "ec710cfc-8539-47c5-8062-95911f973074"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.363804 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec710cfc-8539-47c5-8062-95911f973074-config-data\") pod \"ec710cfc-8539-47c5-8062-95911f973074\" (UID: \"ec710cfc-8539-47c5-8062-95911f973074\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.363875 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-state-metrics-tls-certs\") pod \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.363927 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-logs\") pod \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\" (UID: \"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.364900 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-logs" (OuterVolumeSpecName: "logs") pod "ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" (UID: "ba4333fd-3a72-41c7-a82a-448ed0ccfc1e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.367049 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-state-metrics-tls-config\") pod \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.367119 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zmj4\" (UniqueName: \"kubernetes.io/projected/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-api-access-5zmj4\") pod \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.367144 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-combined-ca-bundle\") pod \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\" (UID: \"0a8246b1-28b8-4eb6-83a3-1e87beecfb78\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.370867 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.370897 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.370915 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b598099-b3f7-4157-8e5f-6eb472806511-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.370924 4749 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.370933 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q55gw\" (UniqueName: \"kubernetes.io/projected/1b598099-b3f7-4157-8e5f-6eb472806511-kube-api-access-q55gw\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.370946 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b598099-b3f7-4157-8e5f-6eb472806511-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.370955 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.370964 4749 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec710cfc-8539-47c5-8062-95911f973074-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.370975 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec710cfc-8539-47c5-8062-95911f973074-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.375566 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-116d-account-create-update-t69qj"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.377056 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-api-access-5zmj4" (OuterVolumeSpecName: "kube-api-access-5zmj4") pod 
"0a8246b1-28b8-4eb6-83a3-1e87beecfb78" (UID: "0a8246b1-28b8-4eb6-83a3-1e87beecfb78"). InnerVolumeSpecName "kube-api-access-5zmj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.377091 4749 scope.go:117] "RemoveContainer" containerID="f8d45398159e5ac21928a6fe846afd80a890bf4e12f47c045bfc44f153806dda" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.377833 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-kube-api-access-bg9q9" (OuterVolumeSpecName: "kube-api-access-bg9q9") pod "ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" (UID: "ba4333fd-3a72-41c7-a82a-448ed0ccfc1e"). InnerVolumeSpecName "kube-api-access-bg9q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.378268 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec710cfc-8539-47c5-8062-95911f973074-kube-api-access-zqj7c" (OuterVolumeSpecName: "kube-api-access-zqj7c") pod "ec710cfc-8539-47c5-8062-95911f973074" (UID: "ec710cfc-8539-47c5-8062-95911f973074"). InnerVolumeSpecName "kube-api-access-zqj7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.378452 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1b598099-b3f7-4157-8e5f-6eb472806511" (UID: "1b598099-b3f7-4157-8e5f-6eb472806511"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.383120 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e794ff07-5e05-4d6c-8cc6-64efd90fd91b/ovn-northd/0.log" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.384388 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.411521 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d45398159e5ac21928a6fe846afd80a890bf4e12f47c045bfc44f153806dda\": container with ID starting with f8d45398159e5ac21928a6fe846afd80a890bf4e12f47c045bfc44f153806dda not found: ID does not exist" containerID="f8d45398159e5ac21928a6fe846afd80a890bf4e12f47c045bfc44f153806dda" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.411585 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d45398159e5ac21928a6fe846afd80a890bf4e12f47c045bfc44f153806dda"} err="failed to get container status \"f8d45398159e5ac21928a6fe846afd80a890bf4e12f47c045bfc44f153806dda\": rpc error: code = NotFound desc = could not find container \"f8d45398159e5ac21928a6fe846afd80a890bf4e12f47c045bfc44f153806dda\": container with ID starting with f8d45398159e5ac21928a6fe846afd80a890bf4e12f47c045bfc44f153806dda not found: ID does not exist" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.411610 4749 scope.go:117] "RemoveContainer" containerID="0b98dca184d3190b9e30a3a5bcef1c368b98ad5b0683d689cbed67e3d6d6fa5c" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.411814 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec710cfc-8539-47c5-8062-95911f973074-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec710cfc-8539-47c5-8062-95911f973074" (UID: 
"ec710cfc-8539-47c5-8062-95911f973074"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.412000 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b98dca184d3190b9e30a3a5bcef1c368b98ad5b0683d689cbed67e3d6d6fa5c\": container with ID starting with 0b98dca184d3190b9e30a3a5bcef1c368b98ad5b0683d689cbed67e3d6d6fa5c not found: ID does not exist" containerID="0b98dca184d3190b9e30a3a5bcef1c368b98ad5b0683d689cbed67e3d6d6fa5c" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.412022 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b98dca184d3190b9e30a3a5bcef1c368b98ad5b0683d689cbed67e3d6d6fa5c"} err="failed to get container status \"0b98dca184d3190b9e30a3a5bcef1c368b98ad5b0683d689cbed67e3d6d6fa5c\": rpc error: code = NotFound desc = could not find container \"0b98dca184d3190b9e30a3a5bcef1c368b98ad5b0683d689cbed67e3d6d6fa5c\": container with ID starting with 0b98dca184d3190b9e30a3a5bcef1c368b98ad5b0683d689cbed67e3d6d6fa5c not found: ID does not exist" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.412035 4749 scope.go:117] "RemoveContainer" containerID="a5a9d70f8163b90e1531e0cff378ba14e371d6d2dacaf35b433db16489b0e4f9" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.424629 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a8246b1-28b8-4eb6-83a3-1e87beecfb78" (UID: "0a8246b1-28b8-4eb6-83a3-1e87beecfb78"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.439309 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" (UID: "ba4333fd-3a72-41c7-a82a-448ed0ccfc1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.446179 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.458833 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f7b864884-n5l5z"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.458924 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-config-data" (OuterVolumeSpecName: "config-data") pod "ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" (UID: "ba4333fd-3a72-41c7-a82a-448ed0ccfc1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.459816 4749 scope.go:117] "RemoveContainer" containerID="8d6ca1fb9f59946d8771109218993126572d8059bdcaaf95b7b5954d7fd097ee" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.467820 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" (UID: "ba4333fd-3a72-41c7-a82a-448ed0ccfc1e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.469547 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "0a8246b1-28b8-4eb6-83a3-1e87beecfb78" (UID: "0a8246b1-28b8-4eb6-83a3-1e87beecfb78"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.472462 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-config-data\") pod \"d61221be-c05f-47ae-a3b5-80f59d809281\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.472547 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80985ef-0a5d-403a-b351-c59bd878723d-config-data\") pod \"e80985ef-0a5d-403a-b351-c59bd878723d\" (UID: \"e80985ef-0a5d-403a-b351-c59bd878723d\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.472582 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nzm2\" (UniqueName: \"kubernetes.io/projected/d61221be-c05f-47ae-a3b5-80f59d809281-kube-api-access-7nzm2\") pod \"d61221be-c05f-47ae-a3b5-80f59d809281\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.472617 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80985ef-0a5d-403a-b351-c59bd878723d-combined-ca-bundle\") pod \"e80985ef-0a5d-403a-b351-c59bd878723d\" (UID: \"e80985ef-0a5d-403a-b351-c59bd878723d\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 
16:12:32.472700 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz4j2\" (UniqueName: \"kubernetes.io/projected/e80985ef-0a5d-403a-b351-c59bd878723d-kube-api-access-fz4j2\") pod \"e80985ef-0a5d-403a-b351-c59bd878723d\" (UID: \"e80985ef-0a5d-403a-b351-c59bd878723d\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.472719 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt5w9\" (UniqueName: \"kubernetes.io/projected/46e39f11-450f-43a3-ba72-7c3e8245e382-kube-api-access-bt5w9\") pod \"46e39f11-450f-43a3-ba72-7c3e8245e382\" (UID: \"46e39f11-450f-43a3-ba72-7c3e8245e382\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.472741 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-nova-metadata-tls-certs\") pod \"d61221be-c05f-47ae-a3b5-80f59d809281\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.472757 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e39f11-450f-43a3-ba72-7c3e8245e382-config-data\") pod \"46e39f11-450f-43a3-ba72-7c3e8245e382\" (UID: \"46e39f11-450f-43a3-ba72-7c3e8245e382\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.472788 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e39f11-450f-43a3-ba72-7c3e8245e382-combined-ca-bundle\") pod \"46e39f11-450f-43a3-ba72-7c3e8245e382\" (UID: \"46e39f11-450f-43a3-ba72-7c3e8245e382\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.472831 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d61221be-c05f-47ae-a3b5-80f59d809281-logs\") pod \"d61221be-c05f-47ae-a3b5-80f59d809281\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.472861 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-combined-ca-bundle\") pod \"d61221be-c05f-47ae-a3b5-80f59d809281\" (UID: \"d61221be-c05f-47ae-a3b5-80f59d809281\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.473231 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zmj4\" (UniqueName: \"kubernetes.io/projected/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-api-access-5zmj4\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.473248 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.473256 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec710cfc-8539-47c5-8062-95911f973074-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.473265 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.473273 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqj7c\" (UniqueName: \"kubernetes.io/projected/ec710cfc-8539-47c5-8062-95911f973074-kube-api-access-zqj7c\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.473281 4749 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg9q9\" (UniqueName: \"kubernetes.io/projected/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-kube-api-access-bg9q9\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.473289 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.473297 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.473308 4749 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.473321 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.481457 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6f7b864884-n5l5z"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.482592 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.482590 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d61221be-c05f-47ae-a3b5-80f59d809281-logs" (OuterVolumeSpecName: "logs") pod "d61221be-c05f-47ae-a3b5-80f59d809281" (UID: 
"d61221be-c05f-47ae-a3b5-80f59d809281"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.489491 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "0a8246b1-28b8-4eb6-83a3-1e87beecfb78" (UID: "0a8246b1-28b8-4eb6-83a3-1e87beecfb78"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.494829 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d61221be-c05f-47ae-a3b5-80f59d809281-kube-api-access-7nzm2" (OuterVolumeSpecName: "kube-api-access-7nzm2") pod "d61221be-c05f-47ae-a3b5-80f59d809281" (UID: "d61221be-c05f-47ae-a3b5-80f59d809281"). InnerVolumeSpecName "kube-api-access-7nzm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.494920 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46e39f11-450f-43a3-ba72-7c3e8245e382-kube-api-access-bt5w9" (OuterVolumeSpecName: "kube-api-access-bt5w9") pod "46e39f11-450f-43a3-ba72-7c3e8245e382" (UID: "46e39f11-450f-43a3-ba72-7c3e8245e382"). InnerVolumeSpecName "kube-api-access-bt5w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.504627 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e80985ef-0a5d-403a-b351-c59bd878723d-kube-api-access-fz4j2" (OuterVolumeSpecName: "kube-api-access-fz4j2") pod "e80985ef-0a5d-403a-b351-c59bd878723d" (UID: "e80985ef-0a5d-403a-b351-c59bd878723d"). InnerVolumeSpecName "kube-api-access-fz4j2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.509570 4749 scope.go:117] "RemoveContainer" containerID="a5a9d70f8163b90e1531e0cff378ba14e371d6d2dacaf35b433db16489b0e4f9" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.509901 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" (UID: "ba4333fd-3a72-41c7-a82a-448ed0ccfc1e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.510022 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5a9d70f8163b90e1531e0cff378ba14e371d6d2dacaf35b433db16489b0e4f9\": container with ID starting with a5a9d70f8163b90e1531e0cff378ba14e371d6d2dacaf35b433db16489b0e4f9 not found: ID does not exist" containerID="a5a9d70f8163b90e1531e0cff378ba14e371d6d2dacaf35b433db16489b0e4f9" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.510051 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5a9d70f8163b90e1531e0cff378ba14e371d6d2dacaf35b433db16489b0e4f9"} err="failed to get container status \"a5a9d70f8163b90e1531e0cff378ba14e371d6d2dacaf35b433db16489b0e4f9\": rpc error: code = NotFound desc = could not find container \"a5a9d70f8163b90e1531e0cff378ba14e371d6d2dacaf35b433db16489b0e4f9\": container with ID starting with a5a9d70f8163b90e1531e0cff378ba14e371d6d2dacaf35b433db16489b0e4f9 not found: ID does not exist" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.510068 4749 scope.go:117] "RemoveContainer" containerID="8d6ca1fb9f59946d8771109218993126572d8059bdcaaf95b7b5954d7fd097ee" Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.510381 4749 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6ca1fb9f59946d8771109218993126572d8059bdcaaf95b7b5954d7fd097ee\": container with ID starting with 8d6ca1fb9f59946d8771109218993126572d8059bdcaaf95b7b5954d7fd097ee not found: ID does not exist" containerID="8d6ca1fb9f59946d8771109218993126572d8059bdcaaf95b7b5954d7fd097ee" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.510403 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6ca1fb9f59946d8771109218993126572d8059bdcaaf95b7b5954d7fd097ee"} err="failed to get container status \"8d6ca1fb9f59946d8771109218993126572d8059bdcaaf95b7b5954d7fd097ee\": rpc error: code = NotFound desc = could not find container \"8d6ca1fb9f59946d8771109218993126572d8059bdcaaf95b7b5954d7fd097ee\": container with ID starting with 8d6ca1fb9f59946d8771109218993126572d8059bdcaaf95b7b5954d7fd097ee not found: ID does not exist" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.510425 4749 scope.go:117] "RemoveContainer" containerID="57585bb9b412ce5a752b8acb87716ffcfdbab4a41883a915954d36af9a0479b4" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.510899 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-config-data" (OuterVolumeSpecName: "config-data") pod "1b598099-b3f7-4157-8e5f-6eb472806511" (UID: "1b598099-b3f7-4157-8e5f-6eb472806511"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.516088 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8dfcffcf6-962bk"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.518808 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-config-data" (OuterVolumeSpecName: "config-data") pod "d61221be-c05f-47ae-a3b5-80f59d809281" (UID: "d61221be-c05f-47ae-a3b5-80f59d809281"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.527464 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8dfcffcf6-962bk"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.527529 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e39f11-450f-43a3-ba72-7c3e8245e382-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46e39f11-450f-43a3-ba72-7c3e8245e382" (UID: "46e39f11-450f-43a3-ba72-7c3e8245e382"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.535789 4749 scope.go:117] "RemoveContainer" containerID="e35a1f1f5541a5e016192c72f9089e80dd9f3fd2c9d8da246bcb6d412f4bd4c2" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.549630 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec710cfc-8539-47c5-8062-95911f973074-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "ec710cfc-8539-47c5-8062-95911f973074" (UID: "ec710cfc-8539-47c5-8062-95911f973074"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.550365 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e80985ef-0a5d-403a-b351-c59bd878723d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e80985ef-0a5d-403a-b351-c59bd878723d" (UID: "e80985ef-0a5d-403a-b351-c59bd878723d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.556213 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-939f-account-create-update-l8hzh"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.556757 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e80985ef-0a5d-403a-b351-c59bd878723d-config-data" (OuterVolumeSpecName: "config-data") pod "e80985ef-0a5d-403a-b351-c59bd878723d" (UID: "e80985ef-0a5d-403a-b351-c59bd878723d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.561650 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-939f-account-create-update-l8hzh"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.566923 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d61221be-c05f-47ae-a3b5-80f59d809281" (UID: "d61221be-c05f-47ae-a3b5-80f59d809281"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.572532 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e39f11-450f-43a3-ba72-7c3e8245e382-config-data" (OuterVolumeSpecName: "config-data") pod "46e39f11-450f-43a3-ba72-7c3e8245e382" (UID: "46e39f11-450f-43a3-ba72-7c3e8245e382"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.574051 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-scripts\") pod \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.574202 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-combined-ca-bundle\") pod \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.574245 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-config\") pod \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.574275 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvbs2\" (UniqueName: \"kubernetes.io/projected/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-kube-api-access-tvbs2\") pod \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.574357 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-metrics-certs-tls-certs\") pod \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.574621 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-scripts" (OuterVolumeSpecName: "scripts") pod "e794ff07-5e05-4d6c-8cc6-64efd90fd91b" (UID: "e794ff07-5e05-4d6c-8cc6-64efd90fd91b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.574955 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-config" (OuterVolumeSpecName: "config") pod "e794ff07-5e05-4d6c-8cc6-64efd90fd91b" (UID: "e794ff07-5e05-4d6c-8cc6-64efd90fd91b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.579096 4749 scope.go:117] "RemoveContainer" containerID="57585bb9b412ce5a752b8acb87716ffcfdbab4a41883a915954d36af9a0479b4" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.579159 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-kube-api-access-tvbs2" (OuterVolumeSpecName: "kube-api-access-tvbs2") pod "e794ff07-5e05-4d6c-8cc6-64efd90fd91b" (UID: "e794ff07-5e05-4d6c-8cc6-64efd90fd91b"). InnerVolumeSpecName "kube-api-access-tvbs2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.582512 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-ovn-rundir\") pod \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.582564 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-ovn-northd-tls-certs\") pod \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\" (UID: \"e794ff07-5e05-4d6c-8cc6-64efd90fd91b\") " Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583223 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583239 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80985ef-0a5d-403a-b351-c59bd878723d-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583251 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nzm2\" (UniqueName: \"kubernetes.io/projected/d61221be-c05f-47ae-a3b5-80f59d809281-kube-api-access-7nzm2\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583265 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583276 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e80985ef-0a5d-403a-b351-c59bd878723d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583289 4749 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec710cfc-8539-47c5-8062-95911f973074-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583300 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583311 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz4j2\" (UniqueName: \"kubernetes.io/projected/e80985ef-0a5d-403a-b351-c59bd878723d-kube-api-access-fz4j2\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583323 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt5w9\" (UniqueName: \"kubernetes.io/projected/46e39f11-450f-43a3-ba72-7c3e8245e382-kube-api-access-bt5w9\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583334 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583360 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e39f11-450f-43a3-ba72-7c3e8245e382-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583378 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/46e39f11-450f-43a3-ba72-7c3e8245e382-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583389 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583402 4749 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8246b1-28b8-4eb6-83a3-1e87beecfb78-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583426 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583437 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvbs2\" (UniqueName: \"kubernetes.io/projected/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-kube-api-access-tvbs2\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583447 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d61221be-c05f-47ae-a3b5-80f59d809281-logs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583456 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b598099-b3f7-4157-8e5f-6eb472806511-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.583446 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57585bb9b412ce5a752b8acb87716ffcfdbab4a41883a915954d36af9a0479b4\": container with ID starting with 
57585bb9b412ce5a752b8acb87716ffcfdbab4a41883a915954d36af9a0479b4 not found: ID does not exist" containerID="57585bb9b412ce5a752b8acb87716ffcfdbab4a41883a915954d36af9a0479b4" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583486 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57585bb9b412ce5a752b8acb87716ffcfdbab4a41883a915954d36af9a0479b4"} err="failed to get container status \"57585bb9b412ce5a752b8acb87716ffcfdbab4a41883a915954d36af9a0479b4\": rpc error: code = NotFound desc = could not find container \"57585bb9b412ce5a752b8acb87716ffcfdbab4a41883a915954d36af9a0479b4\": container with ID starting with 57585bb9b412ce5a752b8acb87716ffcfdbab4a41883a915954d36af9a0479b4 not found: ID does not exist" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.583511 4749 scope.go:117] "RemoveContainer" containerID="e35a1f1f5541a5e016192c72f9089e80dd9f3fd2c9d8da246bcb6d412f4bd4c2" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.590741 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "e794ff07-5e05-4d6c-8cc6-64efd90fd91b" (UID: "e794ff07-5e05-4d6c-8cc6-64efd90fd91b"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.590819 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e35a1f1f5541a5e016192c72f9089e80dd9f3fd2c9d8da246bcb6d412f4bd4c2\": container with ID starting with e35a1f1f5541a5e016192c72f9089e80dd9f3fd2c9d8da246bcb6d412f4bd4c2 not found: ID does not exist" containerID="e35a1f1f5541a5e016192c72f9089e80dd9f3fd2c9d8da246bcb6d412f4bd4c2" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.590841 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e35a1f1f5541a5e016192c72f9089e80dd9f3fd2c9d8da246bcb6d412f4bd4c2"} err="failed to get container status \"e35a1f1f5541a5e016192c72f9089e80dd9f3fd2c9d8da246bcb6d412f4bd4c2\": rpc error: code = NotFound desc = could not find container \"e35a1f1f5541a5e016192c72f9089e80dd9f3fd2c9d8da246bcb6d412f4bd4c2\": container with ID starting with e35a1f1f5541a5e016192c72f9089e80dd9f3fd2c9d8da246bcb6d412f4bd4c2 not found: ID does not exist" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.590965 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.601566 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1b598099-b3f7-4157-8e5f-6eb472806511","Type":"ContainerDied","Data":"0bae8e0bcabe94ac8952147ff854cb9eafce4908323d9aad1306b5064a2e57e8"} Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.601612 4749 scope.go:117] "RemoveContainer" containerID="5d60d1aa5d24cbc57ff5075376de396d004cbb8a0f8a549e929e2a81a8d75bd4" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.601723 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.610929 4749 generic.go:334] "Generic (PLEG): container finished" podID="1ac12213-5bcb-465c-a6aa-fa9e8e97c290" containerID="f986eaa17ad6e58ad34352fe05a57ee439236a7aff45d41d3a9b77a7333e3439" exitCode=1 Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.611099 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vvxvc" event={"ID":"1ac12213-5bcb-465c-a6aa-fa9e8e97c290","Type":"ContainerDied","Data":"f986eaa17ad6e58ad34352fe05a57ee439236a7aff45d41d3a9b77a7333e3439"} Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.633632 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ba4333fd-3a72-41c7-a82a-448ed0ccfc1e","Type":"ContainerDied","Data":"9ae122f4c1054e59447954e132280768320cc528a2c9fe88938a016bff40cf7a"} Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.637739 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.639831 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.641328 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.641594 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.642509 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.642661 4749 prober.go:104] "Probe errored" err="rpc error: 
code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bd2hf" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovsdb-server" Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.648040 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.649654 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:32 crc kubenswrapper[4749]: E0310 16:12:32.649710 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bd2hf" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovs-vswitchd" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.651438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d61221be-c05f-47ae-a3b5-80f59d809281","Type":"ContainerDied","Data":"2ceea8f7d75fd1e2ab9b5b46bb69175d96646dea1f775db99e0e3db2a997c599"} Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.651521 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.657350 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e794ff07-5e05-4d6c-8cc6-64efd90fd91b/ovn-northd/0.log" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.657414 4749 generic.go:334] "Generic (PLEG): container finished" podID="e794ff07-5e05-4d6c-8cc6-64efd90fd91b" containerID="ca2716f51a54c46af5450d10ce7392124f1eeace51046ae8bee36f5a018d9fcf" exitCode=139 Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.657470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e794ff07-5e05-4d6c-8cc6-64efd90fd91b","Type":"ContainerDied","Data":"ca2716f51a54c46af5450d10ce7392124f1eeace51046ae8bee36f5a018d9fcf"} Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.657496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e794ff07-5e05-4d6c-8cc6-64efd90fd91b","Type":"ContainerDied","Data":"0440e056788544c1f196ee262620191e228224c41ab8efd8625523c81eee7d1a"} Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.657556 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.660793 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0a8246b1-28b8-4eb6-83a3-1e87beecfb78","Type":"ContainerDied","Data":"5664a16b9bca18b3a86d87785a9e86c27c72f0ed8bbe7da6cc5d33b19f9a0194"} Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.660871 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.666865 4749 generic.go:334] "Generic (PLEG): container finished" podID="46e39f11-450f-43a3-ba72-7c3e8245e382" containerID="c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7" exitCode=0 Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.666915 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46e39f11-450f-43a3-ba72-7c3e8245e382","Type":"ContainerDied","Data":"c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7"} Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.666990 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46e39f11-450f-43a3-ba72-7c3e8245e382","Type":"ContainerDied","Data":"1c2ee7b9d83ccd87d32de631e78711c1cca60bf2b3e1c17b69c66a5a0a5001ec"} Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.667143 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.667996 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "e794ff07-5e05-4d6c-8cc6-64efd90fd91b" (UID: "e794ff07-5e05-4d6c-8cc6-64efd90fd91b"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.670280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e80985ef-0a5d-403a-b351-c59bd878723d","Type":"ContainerDied","Data":"d35a092ca12c6db810580c404110b4910cbd51eb33e17d8f5d197bbe075f83d6"} Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.670645 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.674243 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ec710cfc-8539-47c5-8062-95911f973074","Type":"ContainerDied","Data":"3d7973dba48a4eaddfd69e15d25cfcd7275ce0526e923e2fae0f1ad5997b7d31"} Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.674320 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.679968 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8701-account-create-update-nnbrt" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.680000 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.680137 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0bf0-account-create-update-6sg9m" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.682018 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e794ff07-5e05-4d6c-8cc6-64efd90fd91b" (UID: "e794ff07-5e05-4d6c-8cc6-64efd90fd91b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.686052 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.687339 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.687437 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.698526 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d61221be-c05f-47ae-a3b5-80f59d809281" (UID: "d61221be-c05f-47ae-a3b5-80f59d809281"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.723094 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e794ff07-5e05-4d6c-8cc6-64efd90fd91b" (UID: "e794ff07-5e05-4d6c-8cc6-64efd90fd91b"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.773151 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vms4g" podUID="0ad2c472-e0a5-43d7-971e-a242a578042b" containerName="ovn-controller" probeResult="failure" output="command timed out" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.788944 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61221be-c05f-47ae-a3b5-80f59d809281-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.788983 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e794ff07-5e05-4d6c-8cc6-64efd90fd91b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.803608 4749 scope.go:117] "RemoveContainer" containerID="cab2c2597fe3eedc75127b4143e5f1b6bbdc90ba2c1b1f74f9e373f1b0ed0f17" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.803894 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vms4g" podUID="0ad2c472-e0a5-43d7-971e-a242a578042b" containerName="ovn-controller" probeResult="failure" output=< Mar 10 16:12:32 crc kubenswrapper[4749]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Mar 10 16:12:32 crc kubenswrapper[4749]: > Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.840441 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0bf0-account-create-update-6sg9m" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.862667 4749 scope.go:117] "RemoveContainer" containerID="98105fccfb28ffdfad3a0b356c5f8eb37b06bcf57c45dd3c31713662a31ddfe5" Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.867443 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.874863 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.900098 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.913203 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.921441 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.932599 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.946316 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:12:32 crc kubenswrapper[4749]: I0310 16:12:32.976629 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.010152 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8701-account-create-update-nnbrt"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.014120 4749 scope.go:117] "RemoveContainer" containerID="15200dc474647db1a4fe0a09c7e300067457404fd74cf484162cb0842079dff1" Mar 10 16:12:33 crc kubenswrapper[4749]: E0310 16:12:33.041886 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 16:12:33 crc kubenswrapper[4749]: E0310 16:12:33.048590 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.059305 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8701-account-create-update-nnbrt"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.060189 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vvxvc" Mar 10 16:12:33 crc kubenswrapper[4749]: E0310 16:12:33.073149 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 16:12:33 crc kubenswrapper[4749]: E0310 16:12:33.073215 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c31b4d97-4ea8-411f-873a-1ad6c133b917" containerName="nova-cell0-conductor-conductor" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.100328 4749 scope.go:117] "RemoveContainer" containerID="223e73b546b611a128d4581fd3fab7f4ad5f58cffc7f3d629e05eb77a8f22f97" Mar 10 16:12:33 crc 
kubenswrapper[4749]: I0310 16:12:33.116764 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.134537 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.135789 4749 scope.go:117] "RemoveContainer" containerID="cd77f536ef94e68fc550ef2465958581ff6e48ae27050c48692c33b29d740bde" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.139875 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.147919 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.156934 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.169522 4749 scope.go:117] "RemoveContainer" containerID="38f4b6694812060a48587ca6ffe71a2cddf74223e2fc0f5cf662287449dd9c81" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.170615 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.177584 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.190677 4749 scope.go:117] "RemoveContainer" containerID="ca2716f51a54c46af5450d10ce7392124f1eeace51046ae8bee36f5a018d9fcf" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.193706 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.201735 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1ac12213-5bcb-465c-a6aa-fa9e8e97c290-operator-scripts\") pod \"1ac12213-5bcb-465c-a6aa-fa9e8e97c290\" (UID: \"1ac12213-5bcb-465c-a6aa-fa9e8e97c290\") " Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.201879 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcjc9\" (UniqueName: \"kubernetes.io/projected/1ac12213-5bcb-465c-a6aa-fa9e8e97c290-kube-api-access-kcjc9\") pod \"1ac12213-5bcb-465c-a6aa-fa9e8e97c290\" (UID: \"1ac12213-5bcb-465c-a6aa-fa9e8e97c290\") " Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.203681 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.203950 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ac12213-5bcb-465c-a6aa-fa9e8e97c290-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ac12213-5bcb-465c-a6aa-fa9e8e97c290" (UID: "1ac12213-5bcb-465c-a6aa-fa9e8e97c290"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.206768 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ac12213-5bcb-465c-a6aa-fa9e8e97c290-kube-api-access-kcjc9" (OuterVolumeSpecName: "kube-api-access-kcjc9") pod "1ac12213-5bcb-465c-a6aa-fa9e8e97c290" (UID: "1ac12213-5bcb-465c-a6aa-fa9e8e97c290"). InnerVolumeSpecName "kube-api-access-kcjc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.213365 4749 scope.go:117] "RemoveContainer" containerID="38f4b6694812060a48587ca6ffe71a2cddf74223e2fc0f5cf662287449dd9c81" Mar 10 16:12:33 crc kubenswrapper[4749]: E0310 16:12:33.213873 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38f4b6694812060a48587ca6ffe71a2cddf74223e2fc0f5cf662287449dd9c81\": container with ID starting with 38f4b6694812060a48587ca6ffe71a2cddf74223e2fc0f5cf662287449dd9c81 not found: ID does not exist" containerID="38f4b6694812060a48587ca6ffe71a2cddf74223e2fc0f5cf662287449dd9c81" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.213904 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38f4b6694812060a48587ca6ffe71a2cddf74223e2fc0f5cf662287449dd9c81"} err="failed to get container status \"38f4b6694812060a48587ca6ffe71a2cddf74223e2fc0f5cf662287449dd9c81\": rpc error: code = NotFound desc = could not find container \"38f4b6694812060a48587ca6ffe71a2cddf74223e2fc0f5cf662287449dd9c81\": container with ID starting with 38f4b6694812060a48587ca6ffe71a2cddf74223e2fc0f5cf662287449dd9c81 not found: ID does not exist" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.213923 4749 scope.go:117] "RemoveContainer" containerID="ca2716f51a54c46af5450d10ce7392124f1eeace51046ae8bee36f5a018d9fcf" Mar 10 16:12:33 crc kubenswrapper[4749]: E0310 16:12:33.214276 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca2716f51a54c46af5450d10ce7392124f1eeace51046ae8bee36f5a018d9fcf\": container with ID starting with ca2716f51a54c46af5450d10ce7392124f1eeace51046ae8bee36f5a018d9fcf not found: ID does not exist" containerID="ca2716f51a54c46af5450d10ce7392124f1eeace51046ae8bee36f5a018d9fcf" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.214298 
4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca2716f51a54c46af5450d10ce7392124f1eeace51046ae8bee36f5a018d9fcf"} err="failed to get container status \"ca2716f51a54c46af5450d10ce7392124f1eeace51046ae8bee36f5a018d9fcf\": rpc error: code = NotFound desc = could not find container \"ca2716f51a54c46af5450d10ce7392124f1eeace51046ae8bee36f5a018d9fcf\": container with ID starting with ca2716f51a54c46af5450d10ce7392124f1eeace51046ae8bee36f5a018d9fcf not found: ID does not exist" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.214311 4749 scope.go:117] "RemoveContainer" containerID="fc7c26df644965a11d66d3971b0f14d486ea00dfbfe753e9e176059c55e661ad" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.215319 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.222431 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.234269 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.245051 4749 scope.go:117] "RemoveContainer" containerID="c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.258779 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.266310 4749 scope.go:117] "RemoveContainer" containerID="c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7" Mar 10 16:12:33 crc kubenswrapper[4749]: E0310 16:12:33.266856 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7\": container with ID starting with c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7 not found: ID does not exist" containerID="c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.266887 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7"} err="failed to get container status \"c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7\": rpc error: code = NotFound desc = could not find container \"c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7\": container with ID starting with c94cc0aebb967a9c43fef75ac2d08afd653d0b68ce900f39b21067be7271dda7 not found: ID does not exist" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.266905 4749 scope.go:117] "RemoveContainer" containerID="c94663d92d50885e5f9777c2186efd9f1a68ab7dca303b557f1ab7d4547ae21e" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.303585 4749 scope.go:117] "RemoveContainer" containerID="62c6ec9ae9969d0a788db39c30b51835114c25e683ecb1bdacaa7737d2e96d89" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.304558 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcjc9\" (UniqueName: \"kubernetes.io/projected/1ac12213-5bcb-465c-a6aa-fa9e8e97c290-kube-api-access-kcjc9\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.304579 
4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ac12213-5bcb-465c-a6aa-fa9e8e97c290-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.405322 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bf7c072-7f7d-4f94-98a5-023b069f0eab-galera-tls-certs\") pod \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.405445 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-config-data-default\") pod \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.405526 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf7c072-7f7d-4f94-98a5-023b069f0eab-combined-ca-bundle\") pod \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.405595 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-kolla-config\") pod \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.405631 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " Mar 10 16:12:33 crc kubenswrapper[4749]: 
I0310 16:12:33.405666 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-operator-scripts\") pod \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.405696 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grgp7\" (UniqueName: \"kubernetes.io/projected/2bf7c072-7f7d-4f94-98a5-023b069f0eab-kube-api-access-grgp7\") pod \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.405727 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2bf7c072-7f7d-4f94-98a5-023b069f0eab-config-data-generated\") pod \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\" (UID: \"2bf7c072-7f7d-4f94-98a5-023b069f0eab\") " Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.407783 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "2bf7c072-7f7d-4f94-98a5-023b069f0eab" (UID: "2bf7c072-7f7d-4f94-98a5-023b069f0eab"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.407909 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2bf7c072-7f7d-4f94-98a5-023b069f0eab" (UID: "2bf7c072-7f7d-4f94-98a5-023b069f0eab"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.408955 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bf7c072-7f7d-4f94-98a5-023b069f0eab-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "2bf7c072-7f7d-4f94-98a5-023b069f0eab" (UID: "2bf7c072-7f7d-4f94-98a5-023b069f0eab"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.409341 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2bf7c072-7f7d-4f94-98a5-023b069f0eab" (UID: "2bf7c072-7f7d-4f94-98a5-023b069f0eab"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.412809 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bf7c072-7f7d-4f94-98a5-023b069f0eab-kube-api-access-grgp7" (OuterVolumeSpecName: "kube-api-access-grgp7") pod "2bf7c072-7f7d-4f94-98a5-023b069f0eab" (UID: "2bf7c072-7f7d-4f94-98a5-023b069f0eab"). InnerVolumeSpecName "kube-api-access-grgp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.433219 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "2bf7c072-7f7d-4f94-98a5-023b069f0eab" (UID: "2bf7c072-7f7d-4f94-98a5-023b069f0eab"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.441508 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf7c072-7f7d-4f94-98a5-023b069f0eab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bf7c072-7f7d-4f94-98a5-023b069f0eab" (UID: "2bf7c072-7f7d-4f94-98a5-023b069f0eab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.461726 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf7c072-7f7d-4f94-98a5-023b069f0eab-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "2bf7c072-7f7d-4f94-98a5-023b069f0eab" (UID: "2bf7c072-7f7d-4f94-98a5-023b069f0eab"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:33 crc kubenswrapper[4749]: E0310 16:12:33.509021 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.509088 4749 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:33 crc kubenswrapper[4749]: E0310 16:12:33.509132 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data podName:1feaa4c9-2cec-45a8-9106-5be885c26eae nodeName:}" failed. No retries permitted until 2026-03-10 16:12:41.509104932 +0000 UTC m=+1458.630970629 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data") pod "rabbitmq-server-0" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae") : configmap "rabbitmq-config-data" not found Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.509192 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.509224 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.509244 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grgp7\" (UniqueName: \"kubernetes.io/projected/2bf7c072-7f7d-4f94-98a5-023b069f0eab-kube-api-access-grgp7\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.509258 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2bf7c072-7f7d-4f94-98a5-023b069f0eab-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.509270 4749 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bf7c072-7f7d-4f94-98a5-023b069f0eab-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.509283 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2bf7c072-7f7d-4f94-98a5-023b069f0eab-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.509296 4749 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf7c072-7f7d-4f94-98a5-023b069f0eab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.532327 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.611063 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.624664 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a8246b1-28b8-4eb6-83a3-1e87beecfb78" path="/var/lib/kubelet/pods/0a8246b1-28b8-4eb6-83a3-1e87beecfb78/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.625336 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="101ece94-d304-4797-a87d-e7fc8deb6199" path="/var/lib/kubelet/pods/101ece94-d304-4797-a87d-e7fc8deb6199/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.625966 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15480433-b4c2-47c5-a7e4-73395b5bd27d" path="/var/lib/kubelet/pods/15480433-b4c2-47c5-a7e4-73395b5bd27d/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.630419 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b598099-b3f7-4157-8e5f-6eb472806511" path="/var/lib/kubelet/pods/1b598099-b3f7-4157-8e5f-6eb472806511/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.631488 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46e39f11-450f-43a3-ba72-7c3e8245e382" path="/var/lib/kubelet/pods/46e39f11-450f-43a3-ba72-7c3e8245e382/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.631981 4749 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="615e021a-88a2-496f-81a4-46d70e40310d" path="/var/lib/kubelet/pods/615e021a-88a2-496f-81a4-46d70e40310d/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.632293 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783a67d8-9e22-4503-82a6-5f49fb50ee7b" path="/var/lib/kubelet/pods/783a67d8-9e22-4503-82a6-5f49fb50ee7b/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.633141 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cc64163-530a-4b31-9acc-84910336b781" path="/var/lib/kubelet/pods/7cc64163-530a-4b31-9acc-84910336b781/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.633733 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="852c97ea-349d-4262-b36c-2ef7aa81ae75" path="/var/lib/kubelet/pods/852c97ea-349d-4262-b36c-2ef7aa81ae75/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.634267 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a0229a2-b07d-4baa-8b4c-a1c356e38679" path="/var/lib/kubelet/pods/8a0229a2-b07d-4baa-8b4c-a1c356e38679/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.636178 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99aedb1b-bca3-41ef-9399-4678f86ac87c" path="/var/lib/kubelet/pods/99aedb1b-bca3-41ef-9399-4678f86ac87c/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.637421 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0d845ea-a98a-43ae-9803-30e5d306d29d" path="/var/lib/kubelet/pods/a0d845ea-a98a-43ae-9803-30e5d306d29d/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.639072 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3814c41-600a-4463-9695-e55c293ffead" path="/var/lib/kubelet/pods/b3814c41-600a-4463-9695-e55c293ffead/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.639782 4749 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="b5ba9db0-29a2-468a-ab78-871620e30790" path="/var/lib/kubelet/pods/b5ba9db0-29a2-468a-ab78-871620e30790/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.640435 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" path="/var/lib/kubelet/pods/ba4333fd-3a72-41c7-a82a-448ed0ccfc1e/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.642923 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ef97dd-388e-4fc7-82bf-908c61ca2fe2" path="/var/lib/kubelet/pods/d5ef97dd-388e-4fc7-82bf-908c61ca2fe2/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.643310 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d61221be-c05f-47ae-a3b5-80f59d809281" path="/var/lib/kubelet/pods/d61221be-c05f-47ae-a3b5-80f59d809281/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.643978 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e794ff07-5e05-4d6c-8cc6-64efd90fd91b" path="/var/lib/kubelet/pods/e794ff07-5e05-4d6c-8cc6-64efd90fd91b/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.646047 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e80985ef-0a5d-403a-b351-c59bd878723d" path="/var/lib/kubelet/pods/e80985ef-0a5d-403a-b351-c59bd878723d/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.646539 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec710cfc-8539-47c5-8062-95911f973074" path="/var/lib/kubelet/pods/ec710cfc-8539-47c5-8062-95911f973074/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.647037 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd8a90f3-a6d3-428e-a049-78cb36e2ed34" path="/var/lib/kubelet/pods/fd8a90f3-a6d3-428e-a049-78cb36e2ed34/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.648130 4749 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="feb87bc1-b9a8-44e7-8603-ba656ef9e65c" path="/var/lib/kubelet/pods/feb87bc1-b9a8-44e7-8603-ba656ef9e65c/volumes" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.706827 4749 generic.go:334] "Generic (PLEG): container finished" podID="2bf7c072-7f7d-4f94-98a5-023b069f0eab" containerID="cc722bd2c5f210c8bfbab86f5f74d73900920f0d1ede31e8eea776e399075b73" exitCode=0 Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.706851 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2bf7c072-7f7d-4f94-98a5-023b069f0eab","Type":"ContainerDied","Data":"cc722bd2c5f210c8bfbab86f5f74d73900920f0d1ede31e8eea776e399075b73"} Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.706880 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2bf7c072-7f7d-4f94-98a5-023b069f0eab","Type":"ContainerDied","Data":"3619a258f327445043c95e77610d928b2d22f732084461c84d8732ffa496a5b9"} Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.706832 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.706896 4749 scope.go:117] "RemoveContainer" containerID="cc722bd2c5f210c8bfbab86f5f74d73900920f0d1ede31e8eea776e399075b73" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.732156 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.737762 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.740569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vvxvc" event={"ID":"1ac12213-5bcb-465c-a6aa-fa9e8e97c290","Type":"ContainerDied","Data":"5ab5b8a23a717e3d7583ae90ab05cc659a2772d63ab4a793dc298238452ec013"} Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.740651 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vvxvc" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.751969 4749 scope.go:117] "RemoveContainer" containerID="1f04605d6e62a2abd9147c5b7ff8f13e35fb6d4061a18f7248ff60266f1c39c5" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.752242 4749 generic.go:334] "Generic (PLEG): container finished" podID="a7637a97-25f4-4696-a41c-545d0d6b0e9a" containerID="261594765d431b29d11923174d8f5b406353566732ec1cf061a23975229933f0" exitCode=0 Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.752322 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0bf0-account-create-update-6sg9m" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.752992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d6cd8c57d-9v7dx" event={"ID":"a7637a97-25f4-4696-a41c-545d0d6b0e9a","Type":"ContainerDied","Data":"261594765d431b29d11923174d8f5b406353566732ec1cf061a23975229933f0"} Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.761359 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vvxvc"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.797431 4749 scope.go:117] "RemoveContainer" containerID="cc722bd2c5f210c8bfbab86f5f74d73900920f0d1ede31e8eea776e399075b73" Mar 10 16:12:33 crc kubenswrapper[4749]: E0310 16:12:33.798668 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc722bd2c5f210c8bfbab86f5f74d73900920f0d1ede31e8eea776e399075b73\": container with ID starting with cc722bd2c5f210c8bfbab86f5f74d73900920f0d1ede31e8eea776e399075b73 not found: ID does not exist" containerID="cc722bd2c5f210c8bfbab86f5f74d73900920f0d1ede31e8eea776e399075b73" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.798695 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc722bd2c5f210c8bfbab86f5f74d73900920f0d1ede31e8eea776e399075b73"} err="failed to get container status \"cc722bd2c5f210c8bfbab86f5f74d73900920f0d1ede31e8eea776e399075b73\": rpc error: code = NotFound desc = could not find container \"cc722bd2c5f210c8bfbab86f5f74d73900920f0d1ede31e8eea776e399075b73\": container with ID starting with cc722bd2c5f210c8bfbab86f5f74d73900920f0d1ede31e8eea776e399075b73 not found: ID does not exist" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.798772 4749 scope.go:117] "RemoveContainer" containerID="1f04605d6e62a2abd9147c5b7ff8f13e35fb6d4061a18f7248ff60266f1c39c5" Mar 10 16:12:33 crc 
kubenswrapper[4749]: E0310 16:12:33.799031 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f04605d6e62a2abd9147c5b7ff8f13e35fb6d4061a18f7248ff60266f1c39c5\": container with ID starting with 1f04605d6e62a2abd9147c5b7ff8f13e35fb6d4061a18f7248ff60266f1c39c5 not found: ID does not exist" containerID="1f04605d6e62a2abd9147c5b7ff8f13e35fb6d4061a18f7248ff60266f1c39c5" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.799058 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f04605d6e62a2abd9147c5b7ff8f13e35fb6d4061a18f7248ff60266f1c39c5"} err="failed to get container status \"1f04605d6e62a2abd9147c5b7ff8f13e35fb6d4061a18f7248ff60266f1c39c5\": rpc error: code = NotFound desc = could not find container \"1f04605d6e62a2abd9147c5b7ff8f13e35fb6d4061a18f7248ff60266f1c39c5\": container with ID starting with 1f04605d6e62a2abd9147c5b7ff8f13e35fb6d4061a18f7248ff60266f1c39c5 not found: ID does not exist" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.799071 4749 scope.go:117] "RemoveContainer" containerID="f986eaa17ad6e58ad34352fe05a57ee439236a7aff45d41d3a9b77a7333e3439" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.799230 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vvxvc"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.814280 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0bf0-account-create-update-6sg9m"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.818840 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0bf0-account-create-update-6sg9m"] Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.859342 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57db588689-ff8h6" podUID="dafd71a4-7276-4bce-84d9-6568e9d38d9d" containerName="dnsmasq-dns" probeResult="failure" 
output="dial tcp 10.217.0.203:5353: i/o timeout" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.916173 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9227a6fc-a568-4df2-be6f-10e0eeb154d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:33 crc kubenswrapper[4749]: I0310 16:12:33.916460 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z74pp\" (UniqueName: \"kubernetes.io/projected/9227a6fc-a568-4df2-be6f-10e0eeb154d1-kube-api-access-z74pp\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.113324 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.227923 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-public-tls-certs\") pod \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.227982 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc74m\" (UniqueName: \"kubernetes.io/projected/a7637a97-25f4-4696-a41c-545d0d6b0e9a-kube-api-access-dc74m\") pod \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.228015 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-internal-tls-certs\") pod \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.228117 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-fernet-keys\") pod \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.228138 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-combined-ca-bundle\") pod \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.228181 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-credential-keys\") pod \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.228224 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-config-data\") pod \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.228240 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-scripts\") pod \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\" (UID: \"a7637a97-25f4-4696-a41c-545d0d6b0e9a\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.234619 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-scripts" (OuterVolumeSpecName: "scripts") pod "a7637a97-25f4-4696-a41c-545d0d6b0e9a" (UID: "a7637a97-25f4-4696-a41c-545d0d6b0e9a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.234992 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a7637a97-25f4-4696-a41c-545d0d6b0e9a" (UID: "a7637a97-25f4-4696-a41c-545d0d6b0e9a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.235513 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7637a97-25f4-4696-a41c-545d0d6b0e9a-kube-api-access-dc74m" (OuterVolumeSpecName: "kube-api-access-dc74m") pod "a7637a97-25f4-4696-a41c-545d0d6b0e9a" (UID: "a7637a97-25f4-4696-a41c-545d0d6b0e9a"). InnerVolumeSpecName "kube-api-access-dc74m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.235599 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a7637a97-25f4-4696-a41c-545d0d6b0e9a" (UID: "a7637a97-25f4-4696-a41c-545d0d6b0e9a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.259933 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7637a97-25f4-4696-a41c-545d0d6b0e9a" (UID: "a7637a97-25f4-4696-a41c-545d0d6b0e9a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.268974 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-config-data" (OuterVolumeSpecName: "config-data") pod "a7637a97-25f4-4696-a41c-545d0d6b0e9a" (UID: "a7637a97-25f4-4696-a41c-545d0d6b0e9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.273641 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a7637a97-25f4-4696-a41c-545d0d6b0e9a" (UID: "a7637a97-25f4-4696-a41c-545d0d6b0e9a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.277577 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a7637a97-25f4-4696-a41c-545d0d6b0e9a" (UID: "a7637a97-25f4-4696-a41c-545d0d6b0e9a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.319203 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.329774 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.329814 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.329825 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.329834 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.329843 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.329851 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.329861 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc74m\" (UniqueName: \"kubernetes.io/projected/a7637a97-25f4-4696-a41c-545d0d6b0e9a-kube-api-access-dc74m\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.329891 4749 
reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7637a97-25f4-4696-a41c-545d0d6b0e9a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.430732 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-config-data\") pod \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.430794 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-plugins\") pod \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.430827 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-server-conf\") pod \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.430855 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-erlang-cookie\") pod \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.430918 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-plugins-conf\") pod \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " Mar 10 16:12:34 crc 
kubenswrapper[4749]: I0310 16:12:34.430970 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.430996 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-pod-info\") pod \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.431025 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnnrs\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-kube-api-access-wnnrs\") pod \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.431104 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-tls\") pod \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.431159 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-erlang-cookie-secret\") pod \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.431189 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-confd\") pod 
\"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\" (UID: \"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.431873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.432174 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.432845 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.432866 4749 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.434684 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.438600 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-pod-info" (OuterVolumeSpecName: "pod-info") pod "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.439416 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.439534 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.441543 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.442854 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-kube-api-access-wnnrs" (OuterVolumeSpecName: "kube-api-access-wnnrs") pod "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3"). InnerVolumeSpecName "kube-api-access-wnnrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.454496 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-config-data" (OuterVolumeSpecName: "config-data") pod "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.481335 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.488236 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-server-conf" (OuterVolumeSpecName: "server-conf") pod "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.540071 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.540106 4749 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.540120 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnnrs\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-kube-api-access-wnnrs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.540131 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.540142 4749 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.540201 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.540213 4749 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.540226 4749 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.555752 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" (UID: "d34f67ec-ba88-43c9-84af-2c59a2dbbbe3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.559456 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.641664 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1feaa4c9-2cec-45a8-9106-5be885c26eae-pod-info\") pod \"1feaa4c9-2cec-45a8-9106-5be885c26eae\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.641719 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-tls\") pod \"1feaa4c9-2cec-45a8-9106-5be885c26eae\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.641776 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data\") pod \"1feaa4c9-2cec-45a8-9106-5be885c26eae\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.641808 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpfqg\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-kube-api-access-lpfqg\") pod \"1feaa4c9-2cec-45a8-9106-5be885c26eae\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.641857 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-plugins\") pod \"1feaa4c9-2cec-45a8-9106-5be885c26eae\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.641923 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1feaa4c9-2cec-45a8-9106-5be885c26eae-erlang-cookie-secret\") pod \"1feaa4c9-2cec-45a8-9106-5be885c26eae\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.641944 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-server-conf\") pod \"1feaa4c9-2cec-45a8-9106-5be885c26eae\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.641972 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-confd\") pod \"1feaa4c9-2cec-45a8-9106-5be885c26eae\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.641997 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"1feaa4c9-2cec-45a8-9106-5be885c26eae\" 
(UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.642013 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-plugins-conf\") pod \"1feaa4c9-2cec-45a8-9106-5be885c26eae\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.642036 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-erlang-cookie\") pod \"1feaa4c9-2cec-45a8-9106-5be885c26eae\" (UID: \"1feaa4c9-2cec-45a8-9106-5be885c26eae\") " Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.642287 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.642302 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.642736 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1feaa4c9-2cec-45a8-9106-5be885c26eae" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.647228 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1feaa4c9-2cec-45a8-9106-5be885c26eae" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.647910 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1feaa4c9-2cec-45a8-9106-5be885c26eae-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1feaa4c9-2cec-45a8-9106-5be885c26eae" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.648126 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1feaa4c9-2cec-45a8-9106-5be885c26eae-pod-info" (OuterVolumeSpecName: "pod-info") pod "1feaa4c9-2cec-45a8-9106-5be885c26eae" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.648295 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1feaa4c9-2cec-45a8-9106-5be885c26eae" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.649608 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1feaa4c9-2cec-45a8-9106-5be885c26eae" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.650681 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "1feaa4c9-2cec-45a8-9106-5be885c26eae" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.651297 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-kube-api-access-lpfqg" (OuterVolumeSpecName: "kube-api-access-lpfqg") pod "1feaa4c9-2cec-45a8-9106-5be885c26eae" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae"). InnerVolumeSpecName "kube-api-access-lpfqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.665308 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data" (OuterVolumeSpecName: "config-data") pod "1feaa4c9-2cec-45a8-9106-5be885c26eae" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.688151 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-server-conf" (OuterVolumeSpecName: "server-conf") pod "1feaa4c9-2cec-45a8-9106-5be885c26eae" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.726871 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1feaa4c9-2cec-45a8-9106-5be885c26eae" (UID: "1feaa4c9-2cec-45a8-9106-5be885c26eae"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.743900 4749 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1feaa4c9-2cec-45a8-9106-5be885c26eae-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.743928 4749 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.743936 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.743957 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 10 
16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.743965 4749 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.743973 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.743982 4749 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1feaa4c9-2cec-45a8-9106-5be885c26eae-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.743990 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.743997 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1feaa4c9-2cec-45a8-9106-5be885c26eae-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.744005 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpfqg\" (UniqueName: \"kubernetes.io/projected/1feaa4c9-2cec-45a8-9106-5be885c26eae-kube-api-access-lpfqg\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.744013 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1feaa4c9-2cec-45a8-9106-5be885c26eae-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.758646 4749 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.762356 4749 generic.go:334] "Generic (PLEG): container finished" podID="1feaa4c9-2cec-45a8-9106-5be885c26eae" containerID="26814de58b1f416e7e0be2cfe89690c8b3811cee361fb2844178ac8832bae25d" exitCode=0 Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.762488 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.763458 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1feaa4c9-2cec-45a8-9106-5be885c26eae","Type":"ContainerDied","Data":"26814de58b1f416e7e0be2cfe89690c8b3811cee361fb2844178ac8832bae25d"} Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.763488 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1feaa4c9-2cec-45a8-9106-5be885c26eae","Type":"ContainerDied","Data":"810f01f8a2436913223fcdb81673d0300b3e2ff151414c944c97f8e18c53d3b3"} Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.763501 4749 scope.go:117] "RemoveContainer" containerID="26814de58b1f416e7e0be2cfe89690c8b3811cee361fb2844178ac8832bae25d" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.766799 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d6cd8c57d-9v7dx" event={"ID":"a7637a97-25f4-4696-a41c-545d0d6b0e9a","Type":"ContainerDied","Data":"389c8bdcf32aecbae0e9690425b24528a8e6771107e0534249c66451d5d9a23f"} Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.766891 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d6cd8c57d-9v7dx" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.779740 4749 generic.go:334] "Generic (PLEG): container finished" podID="d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" containerID="3755df7d0a3f21329c48cc7cedfea9c0673b59bab2514f03a809161f3ed9250a" exitCode=0 Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.779942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3","Type":"ContainerDied","Data":"3755df7d0a3f21329c48cc7cedfea9c0673b59bab2514f03a809161f3ed9250a"} Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.780111 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d34f67ec-ba88-43c9-84af-2c59a2dbbbe3","Type":"ContainerDied","Data":"5ba4d58986e74184358b23a5e61f3e47301a3c7f6dae3924926cf282d77f215d"} Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.780192 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.794779 4749 scope.go:117] "RemoveContainer" containerID="367802e57c3ed96a7c16df84afe14689df4644175e58ac2036c8eabd7a974802" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.819515 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d6cd8c57d-9v7dx"] Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.822642 4749 scope.go:117] "RemoveContainer" containerID="26814de58b1f416e7e0be2cfe89690c8b3811cee361fb2844178ac8832bae25d" Mar 10 16:12:34 crc kubenswrapper[4749]: E0310 16:12:34.826094 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26814de58b1f416e7e0be2cfe89690c8b3811cee361fb2844178ac8832bae25d\": container with ID starting with 26814de58b1f416e7e0be2cfe89690c8b3811cee361fb2844178ac8832bae25d not found: ID does not exist" containerID="26814de58b1f416e7e0be2cfe89690c8b3811cee361fb2844178ac8832bae25d" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.826139 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26814de58b1f416e7e0be2cfe89690c8b3811cee361fb2844178ac8832bae25d"} err="failed to get container status \"26814de58b1f416e7e0be2cfe89690c8b3811cee361fb2844178ac8832bae25d\": rpc error: code = NotFound desc = could not find container \"26814de58b1f416e7e0be2cfe89690c8b3811cee361fb2844178ac8832bae25d\": container with ID starting with 26814de58b1f416e7e0be2cfe89690c8b3811cee361fb2844178ac8832bae25d not found: ID does not exist" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.826174 4749 scope.go:117] "RemoveContainer" containerID="367802e57c3ed96a7c16df84afe14689df4644175e58ac2036c8eabd7a974802" Mar 10 16:12:34 crc kubenswrapper[4749]: E0310 16:12:34.826549 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"367802e57c3ed96a7c16df84afe14689df4644175e58ac2036c8eabd7a974802\": container with ID starting with 367802e57c3ed96a7c16df84afe14689df4644175e58ac2036c8eabd7a974802 not found: ID does not exist" containerID="367802e57c3ed96a7c16df84afe14689df4644175e58ac2036c8eabd7a974802" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.826587 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"367802e57c3ed96a7c16df84afe14689df4644175e58ac2036c8eabd7a974802"} err="failed to get container status \"367802e57c3ed96a7c16df84afe14689df4644175e58ac2036c8eabd7a974802\": rpc error: code = NotFound desc = could not find container \"367802e57c3ed96a7c16df84afe14689df4644175e58ac2036c8eabd7a974802\": container with ID starting with 367802e57c3ed96a7c16df84afe14689df4644175e58ac2036c8eabd7a974802 not found: ID does not exist" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.826614 4749 scope.go:117] "RemoveContainer" containerID="261594765d431b29d11923174d8f5b406353566732ec1cf061a23975229933f0" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.834693 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d6cd8c57d-9v7dx"] Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.842205 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.846189 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.870559 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.895037 4749 scope.go:117] "RemoveContainer" containerID="3755df7d0a3f21329c48cc7cedfea9c0673b59bab2514f03a809161f3ed9250a" Mar 10 16:12:34 crc kubenswrapper[4749]: 
I0310 16:12:34.898588 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.916723 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.918954 4749 scope.go:117] "RemoveContainer" containerID="743b0f5a36ebdb07c835bb35540b750bb909b97412f42ebd1ec0fb4999abbe52" Mar 10 16:12:34 crc kubenswrapper[4749]: E0310 16:12:34.932655 4749 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 10 16:12:34 crc kubenswrapper[4749]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-10T16:12:27Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 10 16:12:34 crc kubenswrapper[4749]: /etc/init.d/functions: line 589: 400 Alarm clock "$@" Mar 10 16:12:34 crc kubenswrapper[4749]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-vms4g" message=< Mar 10 16:12:34 crc kubenswrapper[4749]: Exiting ovn-controller (1) [FAILED] Mar 10 16:12:34 crc kubenswrapper[4749]: Killing ovn-controller (1) [ OK ] Mar 10 16:12:34 crc kubenswrapper[4749]: Killing ovn-controller (1) with SIGKILL [ OK ] Mar 10 16:12:34 crc kubenswrapper[4749]: 2026-03-10T16:12:27Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 10 16:12:34 crc kubenswrapper[4749]: /etc/init.d/functions: line 589: 400 Alarm clock "$@" Mar 10 16:12:34 crc kubenswrapper[4749]: > Mar 10 16:12:34 crc kubenswrapper[4749]: E0310 16:12:34.932703 4749 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 10 16:12:34 crc kubenswrapper[4749]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-10T16:12:27Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 10 16:12:34 crc kubenswrapper[4749]: /etc/init.d/functions: line 589: 400 Alarm clock "$@" Mar 10 
16:12:34 crc kubenswrapper[4749]: > pod="openstack/ovn-controller-vms4g" podUID="0ad2c472-e0a5-43d7-971e-a242a578042b" containerName="ovn-controller" containerID="cri-o://06f9586c9a8464b0b76b1390597145a85c34bac23df52f62d5fa76c48c45d34b" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.932767 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-vms4g" podUID="0ad2c472-e0a5-43d7-971e-a242a578042b" containerName="ovn-controller" containerID="cri-o://06f9586c9a8464b0b76b1390597145a85c34bac23df52f62d5fa76c48c45d34b" gracePeriod=22 Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.941243 4749 scope.go:117] "RemoveContainer" containerID="3755df7d0a3f21329c48cc7cedfea9c0673b59bab2514f03a809161f3ed9250a" Mar 10 16:12:34 crc kubenswrapper[4749]: E0310 16:12:34.941716 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3755df7d0a3f21329c48cc7cedfea9c0673b59bab2514f03a809161f3ed9250a\": container with ID starting with 3755df7d0a3f21329c48cc7cedfea9c0673b59bab2514f03a809161f3ed9250a not found: ID does not exist" containerID="3755df7d0a3f21329c48cc7cedfea9c0673b59bab2514f03a809161f3ed9250a" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.941746 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3755df7d0a3f21329c48cc7cedfea9c0673b59bab2514f03a809161f3ed9250a"} err="failed to get container status \"3755df7d0a3f21329c48cc7cedfea9c0673b59bab2514f03a809161f3ed9250a\": rpc error: code = NotFound desc = could not find container \"3755df7d0a3f21329c48cc7cedfea9c0673b59bab2514f03a809161f3ed9250a\": container with ID starting with 3755df7d0a3f21329c48cc7cedfea9c0673b59bab2514f03a809161f3ed9250a not found: ID does not exist" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.941765 4749 scope.go:117] "RemoveContainer" 
containerID="743b0f5a36ebdb07c835bb35540b750bb909b97412f42ebd1ec0fb4999abbe52" Mar 10 16:12:34 crc kubenswrapper[4749]: E0310 16:12:34.942125 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"743b0f5a36ebdb07c835bb35540b750bb909b97412f42ebd1ec0fb4999abbe52\": container with ID starting with 743b0f5a36ebdb07c835bb35540b750bb909b97412f42ebd1ec0fb4999abbe52 not found: ID does not exist" containerID="743b0f5a36ebdb07c835bb35540b750bb909b97412f42ebd1ec0fb4999abbe52" Mar 10 16:12:34 crc kubenswrapper[4749]: I0310 16:12:34.942148 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"743b0f5a36ebdb07c835bb35540b750bb909b97412f42ebd1ec0fb4999abbe52"} err="failed to get container status \"743b0f5a36ebdb07c835bb35540b750bb909b97412f42ebd1ec0fb4999abbe52\": rpc error: code = NotFound desc = could not find container \"743b0f5a36ebdb07c835bb35540b750bb909b97412f42ebd1ec0fb4999abbe52\": container with ID starting with 743b0f5a36ebdb07c835bb35540b750bb909b97412f42ebd1ec0fb4999abbe52 not found: ID does not exist" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.236340 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vms4g_0ad2c472-e0a5-43d7-971e-a242a578042b/ovn-controller/0.log" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.236417 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vms4g" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.356208 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-run\") pod \"0ad2c472-e0a5-43d7-971e-a242a578042b\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.356289 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad2c472-e0a5-43d7-971e-a242a578042b-ovn-controller-tls-certs\") pod \"0ad2c472-e0a5-43d7-971e-a242a578042b\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.356338 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad2c472-e0a5-43d7-971e-a242a578042b-combined-ca-bundle\") pod \"0ad2c472-e0a5-43d7-971e-a242a578042b\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.356340 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-run" (OuterVolumeSpecName: "var-run") pod "0ad2c472-e0a5-43d7-971e-a242a578042b" (UID: "0ad2c472-e0a5-43d7-971e-a242a578042b"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.356491 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6ghk\" (UniqueName: \"kubernetes.io/projected/0ad2c472-e0a5-43d7-971e-a242a578042b-kube-api-access-g6ghk\") pod \"0ad2c472-e0a5-43d7-971e-a242a578042b\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.356529 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-log-ovn\") pod \"0ad2c472-e0a5-43d7-971e-a242a578042b\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.356585 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ad2c472-e0a5-43d7-971e-a242a578042b-scripts\") pod \"0ad2c472-e0a5-43d7-971e-a242a578042b\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.356608 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-run-ovn\") pod \"0ad2c472-e0a5-43d7-971e-a242a578042b\" (UID: \"0ad2c472-e0a5-43d7-971e-a242a578042b\") " Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.356773 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0ad2c472-e0a5-43d7-971e-a242a578042b" (UID: "0ad2c472-e0a5-43d7-971e-a242a578042b"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.356873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0ad2c472-e0a5-43d7-971e-a242a578042b" (UID: "0ad2c472-e0a5-43d7-971e-a242a578042b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.357715 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.357773 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.357796 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0ad2c472-e0a5-43d7-971e-a242a578042b-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.358080 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad2c472-e0a5-43d7-971e-a242a578042b-scripts" (OuterVolumeSpecName: "scripts") pod "0ad2c472-e0a5-43d7-971e-a242a578042b" (UID: "0ad2c472-e0a5-43d7-971e-a242a578042b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.361804 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ad2c472-e0a5-43d7-971e-a242a578042b-kube-api-access-g6ghk" (OuterVolumeSpecName: "kube-api-access-g6ghk") pod "0ad2c472-e0a5-43d7-971e-a242a578042b" (UID: "0ad2c472-e0a5-43d7-971e-a242a578042b"). InnerVolumeSpecName "kube-api-access-g6ghk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.382866 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ad2c472-e0a5-43d7-971e-a242a578042b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ad2c472-e0a5-43d7-971e-a242a578042b" (UID: "0ad2c472-e0a5-43d7-971e-a242a578042b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.450518 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ad2c472-e0a5-43d7-971e-a242a578042b-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "0ad2c472-e0a5-43d7-971e-a242a578042b" (UID: "0ad2c472-e0a5-43d7-971e-a242a578042b"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.459139 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6ghk\" (UniqueName: \"kubernetes.io/projected/0ad2c472-e0a5-43d7-971e-a242a578042b-kube-api-access-g6ghk\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.459176 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ad2c472-e0a5-43d7-971e-a242a578042b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.459188 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad2c472-e0a5-43d7-971e-a242a578042b-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.459200 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad2c472-e0a5-43d7-971e-a242a578042b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.524900 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.614503 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ac12213-5bcb-465c-a6aa-fa9e8e97c290" path="/var/lib/kubelet/pods/1ac12213-5bcb-465c-a6aa-fa9e8e97c290/volumes" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.615204 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1feaa4c9-2cec-45a8-9106-5be885c26eae" path="/var/lib/kubelet/pods/1feaa4c9-2cec-45a8-9106-5be885c26eae/volumes" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.615855 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bf7c072-7f7d-4f94-98a5-023b069f0eab" path="/var/lib/kubelet/pods/2bf7c072-7f7d-4f94-98a5-023b069f0eab/volumes" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.616847 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9227a6fc-a568-4df2-be6f-10e0eeb154d1" path="/var/lib/kubelet/pods/9227a6fc-a568-4df2-be6f-10e0eeb154d1/volumes" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.617213 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7637a97-25f4-4696-a41c-545d0d6b0e9a" path="/var/lib/kubelet/pods/a7637a97-25f4-4696-a41c-545d0d6b0e9a/volumes" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.618035 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" path="/var/lib/kubelet/pods/d34f67ec-ba88-43c9-84af-2c59a2dbbbe3/volumes" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.661423 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-config-data\") pod \"c31b4d97-4ea8-411f-873a-1ad6c133b917\" (UID: \"c31b4d97-4ea8-411f-873a-1ad6c133b917\") " Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.661521 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25n8c\" (UniqueName: \"kubernetes.io/projected/c31b4d97-4ea8-411f-873a-1ad6c133b917-kube-api-access-25n8c\") pod \"c31b4d97-4ea8-411f-873a-1ad6c133b917\" (UID: \"c31b4d97-4ea8-411f-873a-1ad6c133b917\") " Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.661623 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-combined-ca-bundle\") pod \"c31b4d97-4ea8-411f-873a-1ad6c133b917\" (UID: \"c31b4d97-4ea8-411f-873a-1ad6c133b917\") " Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.665423 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c31b4d97-4ea8-411f-873a-1ad6c133b917-kube-api-access-25n8c" (OuterVolumeSpecName: "kube-api-access-25n8c") pod "c31b4d97-4ea8-411f-873a-1ad6c133b917" (UID: "c31b4d97-4ea8-411f-873a-1ad6c133b917"). InnerVolumeSpecName "kube-api-access-25n8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:35 crc kubenswrapper[4749]: E0310 16:12:35.676805 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-combined-ca-bundle podName:c31b4d97-4ea8-411f-873a-1ad6c133b917 nodeName:}" failed. No retries permitted until 2026-03-10 16:12:36.176768783 +0000 UTC m=+1453.298634470 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-combined-ca-bundle") pod "c31b4d97-4ea8-411f-873a-1ad6c133b917" (UID: "c31b4d97-4ea8-411f-873a-1ad6c133b917") : error deleting /var/lib/kubelet/pods/c31b4d97-4ea8-411f-873a-1ad6c133b917/volume-subpaths: remove /var/lib/kubelet/pods/c31b4d97-4ea8-411f-873a-1ad6c133b917/volume-subpaths: no such file or directory Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.678642 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-config-data" (OuterVolumeSpecName: "config-data") pod "c31b4d97-4ea8-411f-873a-1ad6c133b917" (UID: "c31b4d97-4ea8-411f-873a-1ad6c133b917"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.744677 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-658447d949-bwfgt" podUID="236aa9f6-5238-45de-813d-e0b18c887f64" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.168:9696/\": dial tcp 10.217.0.168:9696: connect: connection refused" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.765847 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.766023 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25n8c\" (UniqueName: \"kubernetes.io/projected/c31b4d97-4ea8-411f-873a-1ad6c133b917-kube-api-access-25n8c\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.805318 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vms4g_0ad2c472-e0a5-43d7-971e-a242a578042b/ovn-controller/0.log" 
Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.805376 4749 generic.go:334] "Generic (PLEG): container finished" podID="0ad2c472-e0a5-43d7-971e-a242a578042b" containerID="06f9586c9a8464b0b76b1390597145a85c34bac23df52f62d5fa76c48c45d34b" exitCode=137 Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.805503 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vms4g" event={"ID":"0ad2c472-e0a5-43d7-971e-a242a578042b","Type":"ContainerDied","Data":"06f9586c9a8464b0b76b1390597145a85c34bac23df52f62d5fa76c48c45d34b"} Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.805554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vms4g" event={"ID":"0ad2c472-e0a5-43d7-971e-a242a578042b","Type":"ContainerDied","Data":"a3b56753cfc0f188dd80527d7164c9264250815e4652e0fc6a9067ef00b12562"} Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.805576 4749 scope.go:117] "RemoveContainer" containerID="06f9586c9a8464b0b76b1390597145a85c34bac23df52f62d5fa76c48c45d34b" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.805759 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vms4g" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.817656 4749 generic.go:334] "Generic (PLEG): container finished" podID="c31b4d97-4ea8-411f-873a-1ad6c133b917" containerID="f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5" exitCode=0 Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.817702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c31b4d97-4ea8-411f-873a-1ad6c133b917","Type":"ContainerDied","Data":"f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5"} Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.817722 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.817734 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c31b4d97-4ea8-411f-873a-1ad6c133b917","Type":"ContainerDied","Data":"ca2f6719329181c4c75d1619750a38a62fa9939486405a0d591f04e0ab04fa8f"} Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.836329 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vms4g"] Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.844300 4749 scope.go:117] "RemoveContainer" containerID="06f9586c9a8464b0b76b1390597145a85c34bac23df52f62d5fa76c48c45d34b" Mar 10 16:12:35 crc kubenswrapper[4749]: E0310 16:12:35.844889 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06f9586c9a8464b0b76b1390597145a85c34bac23df52f62d5fa76c48c45d34b\": container with ID starting with 06f9586c9a8464b0b76b1390597145a85c34bac23df52f62d5fa76c48c45d34b not found: ID does not exist" containerID="06f9586c9a8464b0b76b1390597145a85c34bac23df52f62d5fa76c48c45d34b" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.844927 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f9586c9a8464b0b76b1390597145a85c34bac23df52f62d5fa76c48c45d34b"} err="failed to get container status \"06f9586c9a8464b0b76b1390597145a85c34bac23df52f62d5fa76c48c45d34b\": rpc error: code = NotFound desc = could not find container \"06f9586c9a8464b0b76b1390597145a85c34bac23df52f62d5fa76c48c45d34b\": container with ID starting with 06f9586c9a8464b0b76b1390597145a85c34bac23df52f62d5fa76c48c45d34b not found: ID does not exist" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.844951 4749 scope.go:117] "RemoveContainer" containerID="f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 
16:12:35.847099 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vms4g"] Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.864819 4749 scope.go:117] "RemoveContainer" containerID="f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5" Mar 10 16:12:35 crc kubenswrapper[4749]: E0310 16:12:35.865263 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5\": container with ID starting with f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5 not found: ID does not exist" containerID="f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5" Mar 10 16:12:35 crc kubenswrapper[4749]: I0310 16:12:35.865293 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5"} err="failed to get container status \"f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5\": rpc error: code = NotFound desc = could not find container \"f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5\": container with ID starting with f023c3d478245ed45379b1f05f58d8f7db531f7ae58c1eb011fa680076b0e4c5 not found: ID does not exist" Mar 10 16:12:36 crc kubenswrapper[4749]: I0310 16:12:36.273615 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-combined-ca-bundle\") pod \"c31b4d97-4ea8-411f-873a-1ad6c133b917\" (UID: \"c31b4d97-4ea8-411f-873a-1ad6c133b917\") " Mar 10 16:12:36 crc kubenswrapper[4749]: I0310 16:12:36.276519 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"c31b4d97-4ea8-411f-873a-1ad6c133b917" (UID: "c31b4d97-4ea8-411f-873a-1ad6c133b917"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:36 crc kubenswrapper[4749]: I0310 16:12:36.375527 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31b4d97-4ea8-411f-873a-1ad6c133b917-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:36 crc kubenswrapper[4749]: I0310 16:12:36.455923 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 16:12:36 crc kubenswrapper[4749]: I0310 16:12:36.464041 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 16:12:36 crc kubenswrapper[4749]: I0310 16:12:36.829480 4749 generic.go:334] "Generic (PLEG): container finished" podID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerID="fd96a7ab3a64b263ae86578a46b4ac785d6b725cfd0504260b8b86b6c6c66caa" exitCode=0 Mar 10 16:12:36 crc kubenswrapper[4749]: I0310 16:12:36.829547 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e3d73b4-812e-4486-8467-87c6dfd6ee92","Type":"ContainerDied","Data":"fd96a7ab3a64b263ae86578a46b4ac785d6b725cfd0504260b8b86b6c6c66caa"} Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.126197 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.127777 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d61221be-c05f-47ae-a3b5-80f59d809281" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.128085 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d61221be-c05f-47ae-a3b5-80f59d809281" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.295795 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e3d73b4-812e-4486-8467-87c6dfd6ee92-log-httpd\") pod \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.295860 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-config-data\") pod \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.295887 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nmbw\" (UniqueName: \"kubernetes.io/projected/3e3d73b4-812e-4486-8467-87c6dfd6ee92-kube-api-access-8nmbw\") pod \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.295915 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-ceilometer-tls-certs\") pod \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.295973 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-combined-ca-bundle\") pod \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.296014 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-scripts\") pod \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.296092 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-sg-core-conf-yaml\") pod \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.296137 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e3d73b4-812e-4486-8467-87c6dfd6ee92-run-httpd\") pod \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\" (UID: \"3e3d73b4-812e-4486-8467-87c6dfd6ee92\") " Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.296723 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e3d73b4-812e-4486-8467-87c6dfd6ee92-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3e3d73b4-812e-4486-8467-87c6dfd6ee92" (UID: 
"3e3d73b4-812e-4486-8467-87c6dfd6ee92"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.296939 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e3d73b4-812e-4486-8467-87c6dfd6ee92-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3e3d73b4-812e-4486-8467-87c6dfd6ee92" (UID: "3e3d73b4-812e-4486-8467-87c6dfd6ee92"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.308741 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3d73b4-812e-4486-8467-87c6dfd6ee92-kube-api-access-8nmbw" (OuterVolumeSpecName: "kube-api-access-8nmbw") pod "3e3d73b4-812e-4486-8467-87c6dfd6ee92" (UID: "3e3d73b4-812e-4486-8467-87c6dfd6ee92"). InnerVolumeSpecName "kube-api-access-8nmbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.309927 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-scripts" (OuterVolumeSpecName: "scripts") pod "3e3d73b4-812e-4486-8467-87c6dfd6ee92" (UID: "3e3d73b4-812e-4486-8467-87c6dfd6ee92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.323314 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3e3d73b4-812e-4486-8467-87c6dfd6ee92" (UID: "3e3d73b4-812e-4486-8467-87c6dfd6ee92"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.336700 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3e3d73b4-812e-4486-8467-87c6dfd6ee92" (UID: "3e3d73b4-812e-4486-8467-87c6dfd6ee92"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.355567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e3d73b4-812e-4486-8467-87c6dfd6ee92" (UID: "3e3d73b4-812e-4486-8467-87c6dfd6ee92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.369365 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-config-data" (OuterVolumeSpecName: "config-data") pod "3e3d73b4-812e-4486-8467-87c6dfd6ee92" (UID: "3e3d73b4-812e-4486-8467-87c6dfd6ee92"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.398553 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.398645 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e3d73b4-812e-4486-8467-87c6dfd6ee92-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.398665 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e3d73b4-812e-4486-8467-87c6dfd6ee92-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.398683 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.398742 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nmbw\" (UniqueName: \"kubernetes.io/projected/3e3d73b4-812e-4486-8467-87c6dfd6ee92-kube-api-access-8nmbw\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.398761 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.398778 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.398831 4749 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e3d73b4-812e-4486-8467-87c6dfd6ee92-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.619191 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ad2c472-e0a5-43d7-971e-a242a578042b" path="/var/lib/kubelet/pods/0ad2c472-e0a5-43d7-971e-a242a578042b/volumes" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.621475 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c31b4d97-4ea8-411f-873a-1ad6c133b917" path="/var/lib/kubelet/pods/c31b4d97-4ea8-411f-873a-1ad6c133b917/volumes" Mar 10 16:12:37 crc kubenswrapper[4749]: E0310 16:12:37.641646 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:37 crc kubenswrapper[4749]: E0310 16:12:37.642128 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:37 crc kubenswrapper[4749]: E0310 16:12:37.642757 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" 
containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:37 crc kubenswrapper[4749]: E0310 16:12:37.642802 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bd2hf" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovsdb-server" Mar 10 16:12:37 crc kubenswrapper[4749]: E0310 16:12:37.643415 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:37 crc kubenswrapper[4749]: E0310 16:12:37.645737 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:37 crc kubenswrapper[4749]: E0310 16:12:37.647230 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:37 crc kubenswrapper[4749]: E0310 16:12:37.647284 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bd2hf" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovs-vswitchd" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.843714 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e3d73b4-812e-4486-8467-87c6dfd6ee92","Type":"ContainerDied","Data":"d86b342f87896d3ca2cc4a25ea6baa6a5cf0675797fcb0f0b31cc0e4cc67ad8c"} Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.843761 4749 scope.go:117] "RemoveContainer" containerID="d4163978450ae5a28c7305f78e151c8b39face70e710fef1aa4e65399f74f360" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.843877 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.876882 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.882777 4749 scope.go:117] "RemoveContainer" containerID="a7e54b42d006c7f4c24dab0c52ee76e67b32f801f6a37cf60527b10f12948e8b" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.883441 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.898901 4749 scope.go:117] "RemoveContainer" containerID="fd96a7ab3a64b263ae86578a46b4ac785d6b725cfd0504260b8b86b6c6c66caa" Mar 10 16:12:37 crc kubenswrapper[4749]: I0310 16:12:37.914471 4749 scope.go:117] "RemoveContainer" containerID="595e6a774c6f1fb6971d897ceee5714bc8b70476939e4b04c3cdfc26a133bd65" Mar 10 16:12:39 crc kubenswrapper[4749]: I0310 16:12:39.619558 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" path="/var/lib/kubelet/pods/3e3d73b4-812e-4486-8467-87c6dfd6ee92/volumes" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.494801 4749 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.579163 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-httpd-config\") pod \"236aa9f6-5238-45de-813d-e0b18c887f64\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.579214 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-internal-tls-certs\") pod \"236aa9f6-5238-45de-813d-e0b18c887f64\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.579242 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-config\") pod \"236aa9f6-5238-45de-813d-e0b18c887f64\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.579264 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhf8j\" (UniqueName: \"kubernetes.io/projected/236aa9f6-5238-45de-813d-e0b18c887f64-kube-api-access-dhf8j\") pod \"236aa9f6-5238-45de-813d-e0b18c887f64\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.579397 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-public-tls-certs\") pod \"236aa9f6-5238-45de-813d-e0b18c887f64\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.579449 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-ovndb-tls-certs\") pod \"236aa9f6-5238-45de-813d-e0b18c887f64\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.579469 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-combined-ca-bundle\") pod \"236aa9f6-5238-45de-813d-e0b18c887f64\" (UID: \"236aa9f6-5238-45de-813d-e0b18c887f64\") " Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.585121 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/236aa9f6-5238-45de-813d-e0b18c887f64-kube-api-access-dhf8j" (OuterVolumeSpecName: "kube-api-access-dhf8j") pod "236aa9f6-5238-45de-813d-e0b18c887f64" (UID: "236aa9f6-5238-45de-813d-e0b18c887f64"). InnerVolumeSpecName "kube-api-access-dhf8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.585911 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "236aa9f6-5238-45de-813d-e0b18c887f64" (UID: "236aa9f6-5238-45de-813d-e0b18c887f64"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.620423 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-config" (OuterVolumeSpecName: "config") pod "236aa9f6-5238-45de-813d-e0b18c887f64" (UID: "236aa9f6-5238-45de-813d-e0b18c887f64"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.623889 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "236aa9f6-5238-45de-813d-e0b18c887f64" (UID: "236aa9f6-5238-45de-813d-e0b18c887f64"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.627803 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "236aa9f6-5238-45de-813d-e0b18c887f64" (UID: "236aa9f6-5238-45de-813d-e0b18c887f64"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.633873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "236aa9f6-5238-45de-813d-e0b18c887f64" (UID: "236aa9f6-5238-45de-813d-e0b18c887f64"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:42 crc kubenswrapper[4749]: E0310 16:12:42.640276 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:42 crc kubenswrapper[4749]: E0310 16:12:42.643652 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:42 crc kubenswrapper[4749]: E0310 16:12:42.643981 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:42 crc kubenswrapper[4749]: E0310 16:12:42.644011 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bd2hf" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovsdb-server" Mar 10 16:12:42 crc kubenswrapper[4749]: E0310 16:12:42.646637 4749 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:42 crc kubenswrapper[4749]: E0310 16:12:42.648586 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:42 crc kubenswrapper[4749]: E0310 16:12:42.650019 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:42 crc kubenswrapper[4749]: E0310 16:12:42.650062 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bd2hf" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovs-vswitchd" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.662765 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "236aa9f6-5238-45de-813d-e0b18c887f64" (UID: "236aa9f6-5238-45de-813d-e0b18c887f64"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.681484 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.681783 4749 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.681892 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.681995 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.682105 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.682212 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/236aa9f6-5238-45de-813d-e0b18c887f64-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.682297 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhf8j\" (UniqueName: \"kubernetes.io/projected/236aa9f6-5238-45de-813d-e0b18c887f64-kube-api-access-dhf8j\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.900576 4749 
generic.go:334] "Generic (PLEG): container finished" podID="236aa9f6-5238-45de-813d-e0b18c887f64" containerID="1ef238a15dbaf6a5a3c10ac3efb6186c6bcbe89603f578e61204e28180d61d73" exitCode=0 Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.900681 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-658447d949-bwfgt" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.900689 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-658447d949-bwfgt" event={"ID":"236aa9f6-5238-45de-813d-e0b18c887f64","Type":"ContainerDied","Data":"1ef238a15dbaf6a5a3c10ac3efb6186c6bcbe89603f578e61204e28180d61d73"} Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.901478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-658447d949-bwfgt" event={"ID":"236aa9f6-5238-45de-813d-e0b18c887f64","Type":"ContainerDied","Data":"479b4915e2f2b9054cc92ad9b324f5c20b4c5f07585a3ff0f7aa610a89ff5dbb"} Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.901515 4749 scope.go:117] "RemoveContainer" containerID="44cd16ebefec8b032bd832bb6a1686dd3509d76e1b509d8e93cab3ba9ee33de6" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.934364 4749 scope.go:117] "RemoveContainer" containerID="1ef238a15dbaf6a5a3c10ac3efb6186c6bcbe89603f578e61204e28180d61d73" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.951974 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-658447d949-bwfgt"] Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.960571 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-658447d949-bwfgt"] Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.968090 4749 scope.go:117] "RemoveContainer" containerID="44cd16ebefec8b032bd832bb6a1686dd3509d76e1b509d8e93cab3ba9ee33de6" Mar 10 16:12:42 crc kubenswrapper[4749]: E0310 16:12:42.968982 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"44cd16ebefec8b032bd832bb6a1686dd3509d76e1b509d8e93cab3ba9ee33de6\": container with ID starting with 44cd16ebefec8b032bd832bb6a1686dd3509d76e1b509d8e93cab3ba9ee33de6 not found: ID does not exist" containerID="44cd16ebefec8b032bd832bb6a1686dd3509d76e1b509d8e93cab3ba9ee33de6" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.969036 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44cd16ebefec8b032bd832bb6a1686dd3509d76e1b509d8e93cab3ba9ee33de6"} err="failed to get container status \"44cd16ebefec8b032bd832bb6a1686dd3509d76e1b509d8e93cab3ba9ee33de6\": rpc error: code = NotFound desc = could not find container \"44cd16ebefec8b032bd832bb6a1686dd3509d76e1b509d8e93cab3ba9ee33de6\": container with ID starting with 44cd16ebefec8b032bd832bb6a1686dd3509d76e1b509d8e93cab3ba9ee33de6 not found: ID does not exist" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.969083 4749 scope.go:117] "RemoveContainer" containerID="1ef238a15dbaf6a5a3c10ac3efb6186c6bcbe89603f578e61204e28180d61d73" Mar 10 16:12:42 crc kubenswrapper[4749]: E0310 16:12:42.970023 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef238a15dbaf6a5a3c10ac3efb6186c6bcbe89603f578e61204e28180d61d73\": container with ID starting with 1ef238a15dbaf6a5a3c10ac3efb6186c6bcbe89603f578e61204e28180d61d73 not found: ID does not exist" containerID="1ef238a15dbaf6a5a3c10ac3efb6186c6bcbe89603f578e61204e28180d61d73" Mar 10 16:12:42 crc kubenswrapper[4749]: I0310 16:12:42.970065 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef238a15dbaf6a5a3c10ac3efb6186c6bcbe89603f578e61204e28180d61d73"} err="failed to get container status \"1ef238a15dbaf6a5a3c10ac3efb6186c6bcbe89603f578e61204e28180d61d73\": rpc error: code = NotFound desc = could not find container 
\"1ef238a15dbaf6a5a3c10ac3efb6186c6bcbe89603f578e61204e28180d61d73\": container with ID starting with 1ef238a15dbaf6a5a3c10ac3efb6186c6bcbe89603f578e61204e28180d61d73 not found: ID does not exist" Mar 10 16:12:43 crc kubenswrapper[4749]: I0310 16:12:43.632602 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="236aa9f6-5238-45de-813d-e0b18c887f64" path="/var/lib/kubelet/pods/236aa9f6-5238-45de-813d-e0b18c887f64/volumes" Mar 10 16:12:47 crc kubenswrapper[4749]: E0310 16:12:47.640232 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:47 crc kubenswrapper[4749]: E0310 16:12:47.641312 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:47 crc kubenswrapper[4749]: E0310 16:12:47.641810 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:47 crc kubenswrapper[4749]: E0310 16:12:47.641859 4749 prober.go:104] "Probe errored" err="rpc error: code = 
NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bd2hf" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovsdb-server" Mar 10 16:12:47 crc kubenswrapper[4749]: E0310 16:12:47.642180 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:47 crc kubenswrapper[4749]: E0310 16:12:47.644164 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:47 crc kubenswrapper[4749]: E0310 16:12:47.647594 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:47 crc kubenswrapper[4749]: E0310 16:12:47.648815 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bd2hf" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovs-vswitchd" Mar 10 16:12:52 crc kubenswrapper[4749]: E0310 16:12:52.641353 4749 log.go:32] "ExecSync 
cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:52 crc kubenswrapper[4749]: E0310 16:12:52.642714 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:52 crc kubenswrapper[4749]: E0310 16:12:52.643260 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:52 crc kubenswrapper[4749]: E0310 16:12:52.643452 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 16:12:52 crc kubenswrapper[4749]: E0310 16:12:52.643491 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-bd2hf" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovsdb-server" Mar 10 16:12:52 crc kubenswrapper[4749]: E0310 16:12:52.644820 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:52 crc kubenswrapper[4749]: E0310 16:12:52.646057 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 16:12:52 crc kubenswrapper[4749]: E0310 16:12:52.646104 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-bd2hf" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovs-vswitchd" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.040952 4749 generic.go:334] "Generic (PLEG): container finished" podID="01351004-ea7d-4973-9dd2-859022a35edb" containerID="979fafe5fb14a8e96ba3c95974f251f5e9ed6197a8ab83b091aa994aadb744a2" exitCode=137 Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.041011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"01351004-ea7d-4973-9dd2-859022a35edb","Type":"ContainerDied","Data":"979fafe5fb14a8e96ba3c95974f251f5e9ed6197a8ab83b091aa994aadb744a2"} Mar 10 
16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.041511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"01351004-ea7d-4973-9dd2-859022a35edb","Type":"ContainerDied","Data":"b33eca7bbd7ef710fb918f73760b72855c719ff855e7100356c5ad12cefc8dc0"} Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.041528 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b33eca7bbd7ef710fb918f73760b72855c719ff855e7100356c5ad12cefc8dc0" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.253886 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.410255 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-config-data\") pod \"01351004-ea7d-4973-9dd2-859022a35edb\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.410391 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-combined-ca-bundle\") pod \"01351004-ea7d-4973-9dd2-859022a35edb\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.410452 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-config-data-custom\") pod \"01351004-ea7d-4973-9dd2-859022a35edb\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.410477 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/01351004-ea7d-4973-9dd2-859022a35edb-etc-machine-id\") pod \"01351004-ea7d-4973-9dd2-859022a35edb\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.410510 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwmjd\" (UniqueName: \"kubernetes.io/projected/01351004-ea7d-4973-9dd2-859022a35edb-kube-api-access-mwmjd\") pod \"01351004-ea7d-4973-9dd2-859022a35edb\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.410544 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-scripts\") pod \"01351004-ea7d-4973-9dd2-859022a35edb\" (UID: \"01351004-ea7d-4973-9dd2-859022a35edb\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.411120 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01351004-ea7d-4973-9dd2-859022a35edb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "01351004-ea7d-4973-9dd2-859022a35edb" (UID: "01351004-ea7d-4973-9dd2-859022a35edb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.417059 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-scripts" (OuterVolumeSpecName: "scripts") pod "01351004-ea7d-4973-9dd2-859022a35edb" (UID: "01351004-ea7d-4973-9dd2-859022a35edb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.417890 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01351004-ea7d-4973-9dd2-859022a35edb-kube-api-access-mwmjd" (OuterVolumeSpecName: "kube-api-access-mwmjd") pod "01351004-ea7d-4973-9dd2-859022a35edb" (UID: "01351004-ea7d-4973-9dd2-859022a35edb"). InnerVolumeSpecName "kube-api-access-mwmjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.422863 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.424359 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "01351004-ea7d-4973-9dd2-859022a35edb" (UID: "01351004-ea7d-4973-9dd2-859022a35edb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.474047 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01351004-ea7d-4973-9dd2-859022a35edb" (UID: "01351004-ea7d-4973-9dd2-859022a35edb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.495382 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-config-data" (OuterVolumeSpecName: "config-data") pod "01351004-ea7d-4973-9dd2-859022a35edb" (UID: "01351004-ea7d-4973-9dd2-859022a35edb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.511848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/85d50314-7d2d-4d92-9a78-846a573a3000-cache\") pod \"85d50314-7d2d-4d92-9a78-846a573a3000\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.511953 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"85d50314-7d2d-4d92-9a78-846a573a3000\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.512198 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcbxq\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-kube-api-access-zcbxq\") pod \"85d50314-7d2d-4d92-9a78-846a573a3000\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.512269 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift\") pod \"85d50314-7d2d-4d92-9a78-846a573a3000\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.512292 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d50314-7d2d-4d92-9a78-846a573a3000-combined-ca-bundle\") pod \"85d50314-7d2d-4d92-9a78-846a573a3000\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.512336 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/85d50314-7d2d-4d92-9a78-846a573a3000-lock\") pod \"85d50314-7d2d-4d92-9a78-846a573a3000\" (UID: \"85d50314-7d2d-4d92-9a78-846a573a3000\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.512580 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85d50314-7d2d-4d92-9a78-846a573a3000-cache" (OuterVolumeSpecName: "cache") pod "85d50314-7d2d-4d92-9a78-846a573a3000" (UID: "85d50314-7d2d-4d92-9a78-846a573a3000"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.512605 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.512679 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/01351004-ea7d-4973-9dd2-859022a35edb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.512695 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwmjd\" (UniqueName: \"kubernetes.io/projected/01351004-ea7d-4973-9dd2-859022a35edb-kube-api-access-mwmjd\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.512710 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.512722 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.512734 4749 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01351004-ea7d-4973-9dd2-859022a35edb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.513820 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85d50314-7d2d-4d92-9a78-846a573a3000-lock" (OuterVolumeSpecName: "lock") pod "85d50314-7d2d-4d92-9a78-846a573a3000" (UID: "85d50314-7d2d-4d92-9a78-846a573a3000"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.514876 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "swift") pod "85d50314-7d2d-4d92-9a78-846a573a3000" (UID: "85d50314-7d2d-4d92-9a78-846a573a3000"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.515355 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-kube-api-access-zcbxq" (OuterVolumeSpecName: "kube-api-access-zcbxq") pod "85d50314-7d2d-4d92-9a78-846a573a3000" (UID: "85d50314-7d2d-4d92-9a78-846a573a3000"). InnerVolumeSpecName "kube-api-access-zcbxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.515780 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "85d50314-7d2d-4d92-9a78-846a573a3000" (UID: "85d50314-7d2d-4d92-9a78-846a573a3000"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.613591 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.613618 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcbxq\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-kube-api-access-zcbxq\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.613628 4749 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85d50314-7d2d-4d92-9a78-846a573a3000-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.613637 4749 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/85d50314-7d2d-4d92-9a78-846a573a3000-lock\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.613646 4749 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/85d50314-7d2d-4d92-9a78-846a573a3000-cache\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.623086 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bd2hf_e03a8285-2164-42a8-8887-95bdaf021a73/ovs-vswitchd/0.log" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.624308 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.626979 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.714294 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-log\") pod \"e03a8285-2164-42a8-8887-95bdaf021a73\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.714398 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-log" (OuterVolumeSpecName: "var-log") pod "e03a8285-2164-42a8-8887-95bdaf021a73" (UID: "e03a8285-2164-42a8-8887-95bdaf021a73"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.714428 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrltg\" (UniqueName: \"kubernetes.io/projected/e03a8285-2164-42a8-8887-95bdaf021a73-kube-api-access-nrltg\") pod \"e03a8285-2164-42a8-8887-95bdaf021a73\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.714627 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03a8285-2164-42a8-8887-95bdaf021a73-scripts\") pod \"e03a8285-2164-42a8-8887-95bdaf021a73\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.714662 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-lib\") pod \"e03a8285-2164-42a8-8887-95bdaf021a73\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.714678 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-etc-ovs\") pod \"e03a8285-2164-42a8-8887-95bdaf021a73\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.714699 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-run\") pod \"e03a8285-2164-42a8-8887-95bdaf021a73\" (UID: \"e03a8285-2164-42a8-8887-95bdaf021a73\") " Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.714965 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-lib" (OuterVolumeSpecName: "var-lib") pod "e03a8285-2164-42a8-8887-95bdaf021a73" (UID: "e03a8285-2164-42a8-8887-95bdaf021a73"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.715015 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-run" (OuterVolumeSpecName: "var-run") pod "e03a8285-2164-42a8-8887-95bdaf021a73" (UID: "e03a8285-2164-42a8-8887-95bdaf021a73"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.715015 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "e03a8285-2164-42a8-8887-95bdaf021a73" (UID: "e03a8285-2164-42a8-8887-95bdaf021a73"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.715270 4749 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-lib\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.715290 4749 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.715402 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.715420 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.715432 4749 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e03a8285-2164-42a8-8887-95bdaf021a73-var-log\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.717022 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e03a8285-2164-42a8-8887-95bdaf021a73-scripts" (OuterVolumeSpecName: "scripts") pod 
"e03a8285-2164-42a8-8887-95bdaf021a73" (UID: "e03a8285-2164-42a8-8887-95bdaf021a73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.718055 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e03a8285-2164-42a8-8887-95bdaf021a73-kube-api-access-nrltg" (OuterVolumeSpecName: "kube-api-access-nrltg") pod "e03a8285-2164-42a8-8887-95bdaf021a73" (UID: "e03a8285-2164-42a8-8887-95bdaf021a73"). InnerVolumeSpecName "kube-api-access-nrltg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.778090 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85d50314-7d2d-4d92-9a78-846a573a3000-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85d50314-7d2d-4d92-9a78-846a573a3000" (UID: "85d50314-7d2d-4d92-9a78-846a573a3000"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.817135 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrltg\" (UniqueName: \"kubernetes.io/projected/e03a8285-2164-42a8-8887-95bdaf021a73-kube-api-access-nrltg\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.817242 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03a8285-2164-42a8-8887-95bdaf021a73-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:57 crc kubenswrapper[4749]: I0310 16:12:57.817281 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d50314-7d2d-4d92-9a78-846a573a3000-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.054762 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bd2hf_e03a8285-2164-42a8-8887-95bdaf021a73/ovs-vswitchd/0.log" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.055666 4749 generic.go:334] "Generic (PLEG): container finished" podID="e03a8285-2164-42a8-8887-95bdaf021a73" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" exitCode=137 Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.055703 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bd2hf" event={"ID":"e03a8285-2164-42a8-8887-95bdaf021a73","Type":"ContainerDied","Data":"524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac"} Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.055746 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bd2hf" event={"ID":"e03a8285-2164-42a8-8887-95bdaf021a73","Type":"ContainerDied","Data":"0d9b95003cfa55cb42355742a8b55d563f4484029b151a5419fa30d0458a203a"} Mar 10 16:12:58 crc kubenswrapper[4749]: 
I0310 16:12:58.055759 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bd2hf" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.055769 4749 scope.go:117] "RemoveContainer" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.067552 4749 generic.go:334] "Generic (PLEG): container finished" podID="85d50314-7d2d-4d92-9a78-846a573a3000" containerID="bb3018fbeb8ae7b1c647090c0018a4a47752b2fecaeeb8c6590810e66f9aa576" exitCode=137 Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.067654 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.067691 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.067737 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"bb3018fbeb8ae7b1c647090c0018a4a47752b2fecaeeb8c6590810e66f9aa576"} Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.067760 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"85d50314-7d2d-4d92-9a78-846a573a3000","Type":"ContainerDied","Data":"47f134fe8c0bcee5c546e8282ad7f5660df60ec57745ea899b947150cdb2e3fe"} Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.110140 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.122733 4749 scope.go:117] "RemoveContainer" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.125590 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-scheduler-0"] Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.142146 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-bd2hf"] Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.148493 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-bd2hf"] Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.150407 4749 scope.go:117] "RemoveContainer" containerID="073b3e5ffdefbc2f6d38d2adf80cce91db5bfc5892e2861424a45f5e8e3a1834" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.154266 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.160822 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.181094 4749 scope.go:117] "RemoveContainer" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.181746 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac\": container with ID starting with 524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac not found: ID does not exist" containerID="524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.181786 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac"} err="failed to get container status \"524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac\": rpc error: code = NotFound desc = could not find container \"524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac\": container with ID starting with 
524f7c79609f4f31e9895db678319d2988fbc270e251ac2bed7ccde1f66f67ac not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.181818 4749 scope.go:117] "RemoveContainer" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.182279 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f\": container with ID starting with 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f not found: ID does not exist" containerID="091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.182308 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f"} err="failed to get container status \"091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f\": rpc error: code = NotFound desc = could not find container \"091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f\": container with ID starting with 091210c3a5ec67eec904c2b4876480a1f2be3211f9d6c48dc164057b3b18747f not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.182338 4749 scope.go:117] "RemoveContainer" containerID="073b3e5ffdefbc2f6d38d2adf80cce91db5bfc5892e2861424a45f5e8e3a1834" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.182986 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"073b3e5ffdefbc2f6d38d2adf80cce91db5bfc5892e2861424a45f5e8e3a1834\": container with ID starting with 073b3e5ffdefbc2f6d38d2adf80cce91db5bfc5892e2861424a45f5e8e3a1834 not found: ID does not exist" containerID="073b3e5ffdefbc2f6d38d2adf80cce91db5bfc5892e2861424a45f5e8e3a1834" Mar 10 16:12:58 crc 
kubenswrapper[4749]: I0310 16:12:58.183016 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073b3e5ffdefbc2f6d38d2adf80cce91db5bfc5892e2861424a45f5e8e3a1834"} err="failed to get container status \"073b3e5ffdefbc2f6d38d2adf80cce91db5bfc5892e2861424a45f5e8e3a1834\": rpc error: code = NotFound desc = could not find container \"073b3e5ffdefbc2f6d38d2adf80cce91db5bfc5892e2861424a45f5e8e3a1834\": container with ID starting with 073b3e5ffdefbc2f6d38d2adf80cce91db5bfc5892e2861424a45f5e8e3a1834 not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.183039 4749 scope.go:117] "RemoveContainer" containerID="bb3018fbeb8ae7b1c647090c0018a4a47752b2fecaeeb8c6590810e66f9aa576" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.211246 4749 scope.go:117] "RemoveContainer" containerID="250241ced61fd2f305f7eac66ea651ce382c5d2b773ef64a2685a3c7d8d51177" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.234057 4749 scope.go:117] "RemoveContainer" containerID="3166a1f44ecaa58fb6e63606a58aaeab5a00e8125684017b580c9d113d9e28b5" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.262312 4749 scope.go:117] "RemoveContainer" containerID="939fd72e5f26c025b8bd5f50417db1cf59a2598dfacc057fb8c21e97844cbbfb" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.281452 4749 scope.go:117] "RemoveContainer" containerID="74b8af8f2db1800275caf1f0c1dd54c407bf2a89888af0cb0f77a6137eefaa29" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.301745 4749 scope.go:117] "RemoveContainer" containerID="17f777430cc879db7bf9f1c9d488b86862db908897af54350d46a2360cb2549a" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.321830 4749 scope.go:117] "RemoveContainer" containerID="9e2d6b3f1436ccf55b4a7e8f114d51974e8455865ec7c14f9e8e310b24cbd46f" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.340489 4749 scope.go:117] "RemoveContainer" 
containerID="8e3dc189fd5a5f36d1ed3928478b796a36ebae234d6abfa35187c0cc7daab6fc" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.359364 4749 scope.go:117] "RemoveContainer" containerID="f5efe081840048441e1beed7605a3cb1367701bf79c16745edeb89014e57cfd3" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.383414 4749 scope.go:117] "RemoveContainer" containerID="053fe9a47af41f4524a0e07b8bbd05a0e52c4cfbfed225b9dbae228562ae848d" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.403919 4749 scope.go:117] "RemoveContainer" containerID="65364d61dd576038476454d20bf73aad662c4d66d9c6d66420bac6b5eaf2e5a9" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.421526 4749 scope.go:117] "RemoveContainer" containerID="2efb77202f22e2883bdd91f0e2cbe30b348e7e743f36692bb368eee05f179b1c" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.453317 4749 scope.go:117] "RemoveContainer" containerID="419a6797b9dc5a25731dd54073656c1ca9fe4985a5320bbf58bad0ef2faab065" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.479275 4749 scope.go:117] "RemoveContainer" containerID="69a1fa408b996e90e4ce6fd72aa7dc3a695dcdd428a8dbfb31ec84cff99d7620" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.508887 4749 scope.go:117] "RemoveContainer" containerID="29ac57ed16691afc977b4681e0422ae230ea59a572f4650e3bdddd776aa79bb5" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.531837 4749 scope.go:117] "RemoveContainer" containerID="bb3018fbeb8ae7b1c647090c0018a4a47752b2fecaeeb8c6590810e66f9aa576" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.532314 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb3018fbeb8ae7b1c647090c0018a4a47752b2fecaeeb8c6590810e66f9aa576\": container with ID starting with bb3018fbeb8ae7b1c647090c0018a4a47752b2fecaeeb8c6590810e66f9aa576 not found: ID does not exist" containerID="bb3018fbeb8ae7b1c647090c0018a4a47752b2fecaeeb8c6590810e66f9aa576" Mar 10 16:12:58 crc 
kubenswrapper[4749]: I0310 16:12:58.532342 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb3018fbeb8ae7b1c647090c0018a4a47752b2fecaeeb8c6590810e66f9aa576"} err="failed to get container status \"bb3018fbeb8ae7b1c647090c0018a4a47752b2fecaeeb8c6590810e66f9aa576\": rpc error: code = NotFound desc = could not find container \"bb3018fbeb8ae7b1c647090c0018a4a47752b2fecaeeb8c6590810e66f9aa576\": container with ID starting with bb3018fbeb8ae7b1c647090c0018a4a47752b2fecaeeb8c6590810e66f9aa576 not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.532363 4749 scope.go:117] "RemoveContainer" containerID="250241ced61fd2f305f7eac66ea651ce382c5d2b773ef64a2685a3c7d8d51177" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.532846 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"250241ced61fd2f305f7eac66ea651ce382c5d2b773ef64a2685a3c7d8d51177\": container with ID starting with 250241ced61fd2f305f7eac66ea651ce382c5d2b773ef64a2685a3c7d8d51177 not found: ID does not exist" containerID="250241ced61fd2f305f7eac66ea651ce382c5d2b773ef64a2685a3c7d8d51177" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.532904 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250241ced61fd2f305f7eac66ea651ce382c5d2b773ef64a2685a3c7d8d51177"} err="failed to get container status \"250241ced61fd2f305f7eac66ea651ce382c5d2b773ef64a2685a3c7d8d51177\": rpc error: code = NotFound desc = could not find container \"250241ced61fd2f305f7eac66ea651ce382c5d2b773ef64a2685a3c7d8d51177\": container with ID starting with 250241ced61fd2f305f7eac66ea651ce382c5d2b773ef64a2685a3c7d8d51177 not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.532938 4749 scope.go:117] "RemoveContainer" containerID="3166a1f44ecaa58fb6e63606a58aaeab5a00e8125684017b580c9d113d9e28b5" Mar 10 
16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.533351 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3166a1f44ecaa58fb6e63606a58aaeab5a00e8125684017b580c9d113d9e28b5\": container with ID starting with 3166a1f44ecaa58fb6e63606a58aaeab5a00e8125684017b580c9d113d9e28b5 not found: ID does not exist" containerID="3166a1f44ecaa58fb6e63606a58aaeab5a00e8125684017b580c9d113d9e28b5" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.533392 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3166a1f44ecaa58fb6e63606a58aaeab5a00e8125684017b580c9d113d9e28b5"} err="failed to get container status \"3166a1f44ecaa58fb6e63606a58aaeab5a00e8125684017b580c9d113d9e28b5\": rpc error: code = NotFound desc = could not find container \"3166a1f44ecaa58fb6e63606a58aaeab5a00e8125684017b580c9d113d9e28b5\": container with ID starting with 3166a1f44ecaa58fb6e63606a58aaeab5a00e8125684017b580c9d113d9e28b5 not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.533412 4749 scope.go:117] "RemoveContainer" containerID="939fd72e5f26c025b8bd5f50417db1cf59a2598dfacc057fb8c21e97844cbbfb" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.533972 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"939fd72e5f26c025b8bd5f50417db1cf59a2598dfacc057fb8c21e97844cbbfb\": container with ID starting with 939fd72e5f26c025b8bd5f50417db1cf59a2598dfacc057fb8c21e97844cbbfb not found: ID does not exist" containerID="939fd72e5f26c025b8bd5f50417db1cf59a2598dfacc057fb8c21e97844cbbfb" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.534028 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939fd72e5f26c025b8bd5f50417db1cf59a2598dfacc057fb8c21e97844cbbfb"} err="failed to get container status 
\"939fd72e5f26c025b8bd5f50417db1cf59a2598dfacc057fb8c21e97844cbbfb\": rpc error: code = NotFound desc = could not find container \"939fd72e5f26c025b8bd5f50417db1cf59a2598dfacc057fb8c21e97844cbbfb\": container with ID starting with 939fd72e5f26c025b8bd5f50417db1cf59a2598dfacc057fb8c21e97844cbbfb not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.534067 4749 scope.go:117] "RemoveContainer" containerID="74b8af8f2db1800275caf1f0c1dd54c407bf2a89888af0cb0f77a6137eefaa29" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.534510 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74b8af8f2db1800275caf1f0c1dd54c407bf2a89888af0cb0f77a6137eefaa29\": container with ID starting with 74b8af8f2db1800275caf1f0c1dd54c407bf2a89888af0cb0f77a6137eefaa29 not found: ID does not exist" containerID="74b8af8f2db1800275caf1f0c1dd54c407bf2a89888af0cb0f77a6137eefaa29" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.534547 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74b8af8f2db1800275caf1f0c1dd54c407bf2a89888af0cb0f77a6137eefaa29"} err="failed to get container status \"74b8af8f2db1800275caf1f0c1dd54c407bf2a89888af0cb0f77a6137eefaa29\": rpc error: code = NotFound desc = could not find container \"74b8af8f2db1800275caf1f0c1dd54c407bf2a89888af0cb0f77a6137eefaa29\": container with ID starting with 74b8af8f2db1800275caf1f0c1dd54c407bf2a89888af0cb0f77a6137eefaa29 not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.534568 4749 scope.go:117] "RemoveContainer" containerID="17f777430cc879db7bf9f1c9d488b86862db908897af54350d46a2360cb2549a" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.534827 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"17f777430cc879db7bf9f1c9d488b86862db908897af54350d46a2360cb2549a\": container with ID starting with 17f777430cc879db7bf9f1c9d488b86862db908897af54350d46a2360cb2549a not found: ID does not exist" containerID="17f777430cc879db7bf9f1c9d488b86862db908897af54350d46a2360cb2549a" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.534855 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f777430cc879db7bf9f1c9d488b86862db908897af54350d46a2360cb2549a"} err="failed to get container status \"17f777430cc879db7bf9f1c9d488b86862db908897af54350d46a2360cb2549a\": rpc error: code = NotFound desc = could not find container \"17f777430cc879db7bf9f1c9d488b86862db908897af54350d46a2360cb2549a\": container with ID starting with 17f777430cc879db7bf9f1c9d488b86862db908897af54350d46a2360cb2549a not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.534870 4749 scope.go:117] "RemoveContainer" containerID="9e2d6b3f1436ccf55b4a7e8f114d51974e8455865ec7c14f9e8e310b24cbd46f" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.535290 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e2d6b3f1436ccf55b4a7e8f114d51974e8455865ec7c14f9e8e310b24cbd46f\": container with ID starting with 9e2d6b3f1436ccf55b4a7e8f114d51974e8455865ec7c14f9e8e310b24cbd46f not found: ID does not exist" containerID="9e2d6b3f1436ccf55b4a7e8f114d51974e8455865ec7c14f9e8e310b24cbd46f" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.535313 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e2d6b3f1436ccf55b4a7e8f114d51974e8455865ec7c14f9e8e310b24cbd46f"} err="failed to get container status \"9e2d6b3f1436ccf55b4a7e8f114d51974e8455865ec7c14f9e8e310b24cbd46f\": rpc error: code = NotFound desc = could not find container \"9e2d6b3f1436ccf55b4a7e8f114d51974e8455865ec7c14f9e8e310b24cbd46f\": container with ID 
starting with 9e2d6b3f1436ccf55b4a7e8f114d51974e8455865ec7c14f9e8e310b24cbd46f not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.535326 4749 scope.go:117] "RemoveContainer" containerID="8e3dc189fd5a5f36d1ed3928478b796a36ebae234d6abfa35187c0cc7daab6fc" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.535771 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3dc189fd5a5f36d1ed3928478b796a36ebae234d6abfa35187c0cc7daab6fc\": container with ID starting with 8e3dc189fd5a5f36d1ed3928478b796a36ebae234d6abfa35187c0cc7daab6fc not found: ID does not exist" containerID="8e3dc189fd5a5f36d1ed3928478b796a36ebae234d6abfa35187c0cc7daab6fc" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.535814 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3dc189fd5a5f36d1ed3928478b796a36ebae234d6abfa35187c0cc7daab6fc"} err="failed to get container status \"8e3dc189fd5a5f36d1ed3928478b796a36ebae234d6abfa35187c0cc7daab6fc\": rpc error: code = NotFound desc = could not find container \"8e3dc189fd5a5f36d1ed3928478b796a36ebae234d6abfa35187c0cc7daab6fc\": container with ID starting with 8e3dc189fd5a5f36d1ed3928478b796a36ebae234d6abfa35187c0cc7daab6fc not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.535902 4749 scope.go:117] "RemoveContainer" containerID="f5efe081840048441e1beed7605a3cb1367701bf79c16745edeb89014e57cfd3" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.536248 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5efe081840048441e1beed7605a3cb1367701bf79c16745edeb89014e57cfd3\": container with ID starting with f5efe081840048441e1beed7605a3cb1367701bf79c16745edeb89014e57cfd3 not found: ID does not exist" containerID="f5efe081840048441e1beed7605a3cb1367701bf79c16745edeb89014e57cfd3" Mar 10 
16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.536271 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5efe081840048441e1beed7605a3cb1367701bf79c16745edeb89014e57cfd3"} err="failed to get container status \"f5efe081840048441e1beed7605a3cb1367701bf79c16745edeb89014e57cfd3\": rpc error: code = NotFound desc = could not find container \"f5efe081840048441e1beed7605a3cb1367701bf79c16745edeb89014e57cfd3\": container with ID starting with f5efe081840048441e1beed7605a3cb1367701bf79c16745edeb89014e57cfd3 not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.536291 4749 scope.go:117] "RemoveContainer" containerID="053fe9a47af41f4524a0e07b8bbd05a0e52c4cfbfed225b9dbae228562ae848d" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.536664 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"053fe9a47af41f4524a0e07b8bbd05a0e52c4cfbfed225b9dbae228562ae848d\": container with ID starting with 053fe9a47af41f4524a0e07b8bbd05a0e52c4cfbfed225b9dbae228562ae848d not found: ID does not exist" containerID="053fe9a47af41f4524a0e07b8bbd05a0e52c4cfbfed225b9dbae228562ae848d" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.536702 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"053fe9a47af41f4524a0e07b8bbd05a0e52c4cfbfed225b9dbae228562ae848d"} err="failed to get container status \"053fe9a47af41f4524a0e07b8bbd05a0e52c4cfbfed225b9dbae228562ae848d\": rpc error: code = NotFound desc = could not find container \"053fe9a47af41f4524a0e07b8bbd05a0e52c4cfbfed225b9dbae228562ae848d\": container with ID starting with 053fe9a47af41f4524a0e07b8bbd05a0e52c4cfbfed225b9dbae228562ae848d not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.536725 4749 scope.go:117] "RemoveContainer" 
containerID="65364d61dd576038476454d20bf73aad662c4d66d9c6d66420bac6b5eaf2e5a9" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.537155 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65364d61dd576038476454d20bf73aad662c4d66d9c6d66420bac6b5eaf2e5a9\": container with ID starting with 65364d61dd576038476454d20bf73aad662c4d66d9c6d66420bac6b5eaf2e5a9 not found: ID does not exist" containerID="65364d61dd576038476454d20bf73aad662c4d66d9c6d66420bac6b5eaf2e5a9" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.537182 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65364d61dd576038476454d20bf73aad662c4d66d9c6d66420bac6b5eaf2e5a9"} err="failed to get container status \"65364d61dd576038476454d20bf73aad662c4d66d9c6d66420bac6b5eaf2e5a9\": rpc error: code = NotFound desc = could not find container \"65364d61dd576038476454d20bf73aad662c4d66d9c6d66420bac6b5eaf2e5a9\": container with ID starting with 65364d61dd576038476454d20bf73aad662c4d66d9c6d66420bac6b5eaf2e5a9 not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.537197 4749 scope.go:117] "RemoveContainer" containerID="2efb77202f22e2883bdd91f0e2cbe30b348e7e743f36692bb368eee05f179b1c" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.537483 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2efb77202f22e2883bdd91f0e2cbe30b348e7e743f36692bb368eee05f179b1c\": container with ID starting with 2efb77202f22e2883bdd91f0e2cbe30b348e7e743f36692bb368eee05f179b1c not found: ID does not exist" containerID="2efb77202f22e2883bdd91f0e2cbe30b348e7e743f36692bb368eee05f179b1c" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.537525 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2efb77202f22e2883bdd91f0e2cbe30b348e7e743f36692bb368eee05f179b1c"} err="failed to get container status \"2efb77202f22e2883bdd91f0e2cbe30b348e7e743f36692bb368eee05f179b1c\": rpc error: code = NotFound desc = could not find container \"2efb77202f22e2883bdd91f0e2cbe30b348e7e743f36692bb368eee05f179b1c\": container with ID starting with 2efb77202f22e2883bdd91f0e2cbe30b348e7e743f36692bb368eee05f179b1c not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.537546 4749 scope.go:117] "RemoveContainer" containerID="419a6797b9dc5a25731dd54073656c1ca9fe4985a5320bbf58bad0ef2faab065" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.537949 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"419a6797b9dc5a25731dd54073656c1ca9fe4985a5320bbf58bad0ef2faab065\": container with ID starting with 419a6797b9dc5a25731dd54073656c1ca9fe4985a5320bbf58bad0ef2faab065 not found: ID does not exist" containerID="419a6797b9dc5a25731dd54073656c1ca9fe4985a5320bbf58bad0ef2faab065" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.537974 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"419a6797b9dc5a25731dd54073656c1ca9fe4985a5320bbf58bad0ef2faab065"} err="failed to get container status \"419a6797b9dc5a25731dd54073656c1ca9fe4985a5320bbf58bad0ef2faab065\": rpc error: code = NotFound desc = could not find container \"419a6797b9dc5a25731dd54073656c1ca9fe4985a5320bbf58bad0ef2faab065\": container with ID starting with 419a6797b9dc5a25731dd54073656c1ca9fe4985a5320bbf58bad0ef2faab065 not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.537989 4749 scope.go:117] "RemoveContainer" containerID="69a1fa408b996e90e4ce6fd72aa7dc3a695dcdd428a8dbfb31ec84cff99d7620" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.538351 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"69a1fa408b996e90e4ce6fd72aa7dc3a695dcdd428a8dbfb31ec84cff99d7620\": container with ID starting with 69a1fa408b996e90e4ce6fd72aa7dc3a695dcdd428a8dbfb31ec84cff99d7620 not found: ID does not exist" containerID="69a1fa408b996e90e4ce6fd72aa7dc3a695dcdd428a8dbfb31ec84cff99d7620" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.538398 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69a1fa408b996e90e4ce6fd72aa7dc3a695dcdd428a8dbfb31ec84cff99d7620"} err="failed to get container status \"69a1fa408b996e90e4ce6fd72aa7dc3a695dcdd428a8dbfb31ec84cff99d7620\": rpc error: code = NotFound desc = could not find container \"69a1fa408b996e90e4ce6fd72aa7dc3a695dcdd428a8dbfb31ec84cff99d7620\": container with ID starting with 69a1fa408b996e90e4ce6fd72aa7dc3a695dcdd428a8dbfb31ec84cff99d7620 not found: ID does not exist" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.538416 4749 scope.go:117] "RemoveContainer" containerID="29ac57ed16691afc977b4681e0422ae230ea59a572f4650e3bdddd776aa79bb5" Mar 10 16:12:58 crc kubenswrapper[4749]: E0310 16:12:58.538830 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29ac57ed16691afc977b4681e0422ae230ea59a572f4650e3bdddd776aa79bb5\": container with ID starting with 29ac57ed16691afc977b4681e0422ae230ea59a572f4650e3bdddd776aa79bb5 not found: ID does not exist" containerID="29ac57ed16691afc977b4681e0422ae230ea59a572f4650e3bdddd776aa79bb5" Mar 10 16:12:58 crc kubenswrapper[4749]: I0310 16:12:58.538861 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29ac57ed16691afc977b4681e0422ae230ea59a572f4650e3bdddd776aa79bb5"} err="failed to get container status \"29ac57ed16691afc977b4681e0422ae230ea59a572f4650e3bdddd776aa79bb5\": rpc error: code = NotFound desc = could not find container 
\"29ac57ed16691afc977b4681e0422ae230ea59a572f4650e3bdddd776aa79bb5\": container with ID starting with 29ac57ed16691afc977b4681e0422ae230ea59a572f4650e3bdddd776aa79bb5 not found: ID does not exist" Mar 10 16:12:59 crc kubenswrapper[4749]: I0310 16:12:59.615972 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01351004-ea7d-4973-9dd2-859022a35edb" path="/var/lib/kubelet/pods/01351004-ea7d-4973-9dd2-859022a35edb/volumes" Mar 10 16:12:59 crc kubenswrapper[4749]: I0310 16:12:59.617490 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" path="/var/lib/kubelet/pods/85d50314-7d2d-4d92-9a78-846a573a3000/volumes" Mar 10 16:12:59 crc kubenswrapper[4749]: I0310 16:12:59.619785 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" path="/var/lib/kubelet/pods/e03a8285-2164-42a8-8887-95bdaf021a73/volumes" Mar 10 16:13:20 crc kubenswrapper[4749]: I0310 16:13:20.980627 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:13:20 crc kubenswrapper[4749]: I0310 16:13:20.981656 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:13:50 crc kubenswrapper[4749]: I0310 16:13:50.980894 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:13:50 crc kubenswrapper[4749]: I0310 16:13:50.981982 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.140918 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552654-cpwdb"] Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.141814 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-server" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.141829 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-server" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.141840 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8246b1-28b8-4eb6-83a3-1e87beecfb78" containerName="kube-state-metrics" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.141846 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8246b1-28b8-4eb6-83a3-1e87beecfb78" containerName="kube-state-metrics" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.141859 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61221be-c05f-47ae-a3b5-80f59d809281" containerName="nova-metadata-metadata" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.141866 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61221be-c05f-47ae-a3b5-80f59d809281" containerName="nova-metadata-metadata" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.141880 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a7637a97-25f4-4696-a41c-545d0d6b0e9a" containerName="keystone-api" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.141885 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7637a97-25f4-4696-a41c-545d0d6b0e9a" containerName="keystone-api" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.141897 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b598099-b3f7-4157-8e5f-6eb472806511" containerName="glance-log" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.141902 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b598099-b3f7-4157-8e5f-6eb472806511" containerName="glance-log" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.141911 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="ceilometer-notification-agent" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.141918 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="ceilometer-notification-agent" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.141925 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15480433-b4c2-47c5-a7e4-73395b5bd27d" containerName="glance-log" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.141930 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="15480433-b4c2-47c5-a7e4-73395b5bd27d" containerName="glance-log" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.141938 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="swift-recon-cron" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.141944 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="swift-recon-cron" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.141954 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-auditor" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.141959 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-auditor" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.141967 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-updater" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.141972 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-updater" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.141981 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ad2c472-e0a5-43d7-971e-a242a578042b" containerName="ovn-controller" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.141986 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad2c472-e0a5-43d7-971e-a242a578042b" containerName="ovn-controller" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.141996 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovsdb-server" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142001 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovsdb-server" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142008 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-replicator" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142014 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-replicator" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142022 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="236aa9f6-5238-45de-813d-e0b18c887f64" containerName="neutron-api" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142028 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="236aa9f6-5238-45de-813d-e0b18c887f64" containerName="neutron-api" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142034 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d845ea-a98a-43ae-9803-30e5d306d29d" containerName="cinder-api" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142039 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d845ea-a98a-43ae-9803-30e5d306d29d" containerName="cinder-api" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142050 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" containerName="nova-api-log" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142055 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" containerName="nova-api-log" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142063 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e794ff07-5e05-4d6c-8cc6-64efd90fd91b" containerName="ovn-northd" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142068 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e794ff07-5e05-4d6c-8cc6-64efd90fd91b" containerName="ovn-northd" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142078 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="proxy-httpd" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142084 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="proxy-httpd" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142093 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" 
containerName="account-auditor" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142099 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-auditor" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142110 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01351004-ea7d-4973-9dd2-859022a35edb" containerName="probe" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142118 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01351004-ea7d-4973-9dd2-859022a35edb" containerName="probe" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142125 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61221be-c05f-47ae-a3b5-80f59d809281" containerName="nova-metadata-log" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142134 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61221be-c05f-47ae-a3b5-80f59d809281" containerName="nova-metadata-log" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142143 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec710cfc-8539-47c5-8062-95911f973074" containerName="memcached" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142148 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec710cfc-8539-47c5-8062-95911f973074" containerName="memcached" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142158 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1feaa4c9-2cec-45a8-9106-5be885c26eae" containerName="setup-container" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142163 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1feaa4c9-2cec-45a8-9106-5be885c26eae" containerName="setup-container" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142170 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-reaper" Mar 10 
16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142176 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-reaper" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142184 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" containerName="nova-api-api" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142189 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" containerName="nova-api-api" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142198 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01351004-ea7d-4973-9dd2-859022a35edb" containerName="cinder-scheduler" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142205 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01351004-ea7d-4973-9dd2-859022a35edb" containerName="cinder-scheduler" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142217 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b598099-b3f7-4157-8e5f-6eb472806511" containerName="glance-httpd" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142225 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b598099-b3f7-4157-8e5f-6eb472806511" containerName="glance-httpd" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142234 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-replicator" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142240 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-replicator" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142250 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e794ff07-5e05-4d6c-8cc6-64efd90fd91b" containerName="openstack-network-exporter" Mar 10 
16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142256 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e794ff07-5e05-4d6c-8cc6-64efd90fd91b" containerName="openstack-network-exporter" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142263 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="rsync" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142269 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="rsync" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142276 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c31b4d97-4ea8-411f-873a-1ad6c133b917" containerName="nova-cell0-conductor-conductor" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142282 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31b4d97-4ea8-411f-873a-1ad6c133b917" containerName="nova-cell0-conductor-conductor" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142293 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15480433-b4c2-47c5-a7e4-73395b5bd27d" containerName="glance-httpd" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142300 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="15480433-b4c2-47c5-a7e4-73395b5bd27d" containerName="glance-httpd" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142312 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovsdb-server-init" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142319 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovsdb-server-init" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142329 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e80985ef-0a5d-403a-b351-c59bd878723d" 
containerName="nova-cell1-conductor-conductor" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142337 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80985ef-0a5d-403a-b351-c59bd878723d" containerName="nova-cell1-conductor-conductor" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142349 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1feaa4c9-2cec-45a8-9106-5be885c26eae" containerName="rabbitmq" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142356 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1feaa4c9-2cec-45a8-9106-5be885c26eae" containerName="rabbitmq" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142363 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-server" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142371 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-server" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142469 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" containerName="rabbitmq" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142477 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" containerName="rabbitmq" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142487 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc64163-530a-4b31-9acc-84910336b781" containerName="placement-api" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142495 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc64163-530a-4b31-9acc-84910336b781" containerName="placement-api" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142505 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" 
containerName="container-updater" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142513 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-updater" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142522 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-auditor" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142529 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-auditor" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142543 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovs-vswitchd" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142550 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovs-vswitchd" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142560 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" containerName="setup-container" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142567 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" containerName="setup-container" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142577 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="sg-core" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142585 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="sg-core" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142593 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd8a90f3-a6d3-428e-a049-78cb36e2ed34" containerName="barbican-api-log" 
Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142600 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd8a90f3-a6d3-428e-a049-78cb36e2ed34" containerName="barbican-api-log" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142612 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac12213-5bcb-465c-a6aa-fa9e8e97c290" containerName="mariadb-account-create-update" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142619 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac12213-5bcb-465c-a6aa-fa9e8e97c290" containerName="mariadb-account-create-update" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142634 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-expirer" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142641 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-expirer" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142654 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf7c072-7f7d-4f94-98a5-023b069f0eab" containerName="mysql-bootstrap" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142661 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf7c072-7f7d-4f94-98a5-023b069f0eab" containerName="mysql-bootstrap" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142671 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="ceilometer-central-agent" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142678 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="ceilometer-central-agent" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142691 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc64163-530a-4b31-9acc-84910336b781" 
containerName="placement-log" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142698 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc64163-530a-4b31-9acc-84910336b781" containerName="placement-log" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142711 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-replicator" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142719 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-replicator" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142730 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e39f11-450f-43a3-ba72-7c3e8245e382" containerName="nova-scheduler-scheduler" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142737 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="46e39f11-450f-43a3-ba72-7c3e8245e382" containerName="nova-scheduler-scheduler" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142750 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-server" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142756 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-server" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142766 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd8a90f3-a6d3-428e-a049-78cb36e2ed34" containerName="barbican-api" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142772 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd8a90f3-a6d3-428e-a049-78cb36e2ed34" containerName="barbican-api" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142785 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf7c072-7f7d-4f94-98a5-023b069f0eab" 
containerName="galera" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142792 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf7c072-7f7d-4f94-98a5-023b069f0eab" containerName="galera" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142801 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236aa9f6-5238-45de-813d-e0b18c887f64" containerName="neutron-httpd" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142808 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="236aa9f6-5238-45de-813d-e0b18c887f64" containerName="neutron-httpd" Mar 10 16:14:00 crc kubenswrapper[4749]: E0310 16:14:00.142818 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d845ea-a98a-43ae-9803-30e5d306d29d" containerName="cinder-api-log" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142825 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d845ea-a98a-43ae-9803-30e5d306d29d" containerName="cinder-api-log" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.142995 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-updater" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143009 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="236aa9f6-5238-45de-813d-e0b18c887f64" containerName="neutron-httpd" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143022 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc64163-530a-4b31-9acc-84910336b781" containerName="placement-log" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143033 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="236aa9f6-5238-45de-813d-e0b18c887f64" containerName="neutron-api" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143045 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ad2c472-e0a5-43d7-971e-a242a578042b" containerName="ovn-controller" 
Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143053 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="sg-core" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143067 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="ceilometer-central-agent" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143079 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-updater" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143089 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-server" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143102 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" containerName="nova-api-api" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143112 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="proxy-httpd" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143123 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc64163-530a-4b31-9acc-84910336b781" containerName="placement-api" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143135 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bf7c072-7f7d-4f94-98a5-023b069f0eab" containerName="galera" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143145 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="46e39f11-450f-43a3-ba72-7c3e8245e382" containerName="nova-scheduler-scheduler" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143159 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a8246b1-28b8-4eb6-83a3-1e87beecfb78" 
containerName="kube-state-metrics" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143167 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e794ff07-5e05-4d6c-8cc6-64efd90fd91b" containerName="ovn-northd" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143178 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61221be-c05f-47ae-a3b5-80f59d809281" containerName="nova-metadata-metadata" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143187 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec710cfc-8539-47c5-8062-95911f973074" containerName="memcached" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143197 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7637a97-25f4-4696-a41c-545d0d6b0e9a" containerName="keystone-api" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143207 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-replicator" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143220 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d845ea-a98a-43ae-9803-30e5d306d29d" containerName="cinder-api-log" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143230 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="15480433-b4c2-47c5-a7e4-73395b5bd27d" containerName="glance-httpd" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143241 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-replicator" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143249 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-auditor" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143257 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="01351004-ea7d-4973-9dd2-859022a35edb" containerName="cinder-scheduler" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143269 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovs-vswitchd" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143278 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac12213-5bcb-465c-a6aa-fa9e8e97c290" containerName="mariadb-account-create-update" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143289 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="15480433-b4c2-47c5-a7e4-73395b5bd27d" containerName="glance-log" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143300 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b598099-b3f7-4157-8e5f-6eb472806511" containerName="glance-httpd" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143307 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="account-reaper" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143315 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-server" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143325 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-expirer" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143336 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d845ea-a98a-43ae-9803-30e5d306d29d" containerName="cinder-api" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143343 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61221be-c05f-47ae-a3b5-80f59d809281" containerName="nova-metadata-log" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143354 4749 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="fd8a90f3-a6d3-428e-a049-78cb36e2ed34" containerName="barbican-api-log" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143363 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="container-auditor" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143393 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="swift-recon-cron" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143403 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="rsync" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143412 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-replicator" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143424 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba4333fd-3a72-41c7-a82a-448ed0ccfc1e" containerName="nova-api-log" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143434 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e80985ef-0a5d-403a-b351-c59bd878723d" containerName="nova-cell1-conductor-conductor" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143444 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b598099-b3f7-4157-8e5f-6eb472806511" containerName="glance-log" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143453 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c31b4d97-4ea8-411f-873a-1ad6c133b917" containerName="nova-cell0-conductor-conductor" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143463 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-server" Mar 10 16:14:00 crc kubenswrapper[4749]: 
I0310 16:14:00.143472 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34f67ec-ba88-43c9-84af-2c59a2dbbbe3" containerName="rabbitmq" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143478 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="01351004-ea7d-4973-9dd2-859022a35edb" containerName="probe" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143490 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3d73b4-812e-4486-8467-87c6dfd6ee92" containerName="ceilometer-notification-agent" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143501 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e794ff07-5e05-4d6c-8cc6-64efd90fd91b" containerName="openstack-network-exporter" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143510 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03a8285-2164-42a8-8887-95bdaf021a73" containerName="ovsdb-server" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143519 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1feaa4c9-2cec-45a8-9106-5be885c26eae" containerName="rabbitmq" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143528 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd8a90f3-a6d3-428e-a049-78cb36e2ed34" containerName="barbican-api" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.143537 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d50314-7d2d-4d92-9a78-846a573a3000" containerName="object-auditor" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.144050 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552654-cpwdb" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.147909 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.148124 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.148301 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.153758 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552654-cpwdb"] Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.261564 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69f2t\" (UniqueName: \"kubernetes.io/projected/7445876f-74e1-4179-a304-c71e79e6d5d8-kube-api-access-69f2t\") pod \"auto-csr-approver-29552654-cpwdb\" (UID: \"7445876f-74e1-4179-a304-c71e79e6d5d8\") " pod="openshift-infra/auto-csr-approver-29552654-cpwdb" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.363491 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69f2t\" (UniqueName: \"kubernetes.io/projected/7445876f-74e1-4179-a304-c71e79e6d5d8-kube-api-access-69f2t\") pod \"auto-csr-approver-29552654-cpwdb\" (UID: \"7445876f-74e1-4179-a304-c71e79e6d5d8\") " pod="openshift-infra/auto-csr-approver-29552654-cpwdb" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.382767 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69f2t\" (UniqueName: \"kubernetes.io/projected/7445876f-74e1-4179-a304-c71e79e6d5d8-kube-api-access-69f2t\") pod \"auto-csr-approver-29552654-cpwdb\" (UID: \"7445876f-74e1-4179-a304-c71e79e6d5d8\") " 
pod="openshift-infra/auto-csr-approver-29552654-cpwdb" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.464605 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552654-cpwdb" Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.880971 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552654-cpwdb"] Mar 10 16:14:00 crc kubenswrapper[4749]: I0310 16:14:00.889985 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:14:01 crc kubenswrapper[4749]: I0310 16:14:01.672669 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552654-cpwdb" event={"ID":"7445876f-74e1-4179-a304-c71e79e6d5d8","Type":"ContainerStarted","Data":"3f5abc17f69a6f10eb162d4565f36713accbe63e248b4bfa834beb388a43962f"} Mar 10 16:14:02 crc kubenswrapper[4749]: I0310 16:14:02.681065 4749 generic.go:334] "Generic (PLEG): container finished" podID="7445876f-74e1-4179-a304-c71e79e6d5d8" containerID="6158da3960733661287c4fa53fec91bc77be2aecd94c8547723954fe2ad29c49" exitCode=0 Mar 10 16:14:02 crc kubenswrapper[4749]: I0310 16:14:02.681198 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552654-cpwdb" event={"ID":"7445876f-74e1-4179-a304-c71e79e6d5d8","Type":"ContainerDied","Data":"6158da3960733661287c4fa53fec91bc77be2aecd94c8547723954fe2ad29c49"} Mar 10 16:14:03 crc kubenswrapper[4749]: I0310 16:14:03.954267 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552654-cpwdb" Mar 10 16:14:04 crc kubenswrapper[4749]: I0310 16:14:04.026981 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69f2t\" (UniqueName: \"kubernetes.io/projected/7445876f-74e1-4179-a304-c71e79e6d5d8-kube-api-access-69f2t\") pod \"7445876f-74e1-4179-a304-c71e79e6d5d8\" (UID: \"7445876f-74e1-4179-a304-c71e79e6d5d8\") " Mar 10 16:14:04 crc kubenswrapper[4749]: I0310 16:14:04.031981 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7445876f-74e1-4179-a304-c71e79e6d5d8-kube-api-access-69f2t" (OuterVolumeSpecName: "kube-api-access-69f2t") pod "7445876f-74e1-4179-a304-c71e79e6d5d8" (UID: "7445876f-74e1-4179-a304-c71e79e6d5d8"). InnerVolumeSpecName "kube-api-access-69f2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:14:04 crc kubenswrapper[4749]: I0310 16:14:04.128834 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69f2t\" (UniqueName: \"kubernetes.io/projected/7445876f-74e1-4179-a304-c71e79e6d5d8-kube-api-access-69f2t\") on node \"crc\" DevicePath \"\"" Mar 10 16:14:04 crc kubenswrapper[4749]: I0310 16:14:04.701154 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552654-cpwdb" event={"ID":"7445876f-74e1-4179-a304-c71e79e6d5d8","Type":"ContainerDied","Data":"3f5abc17f69a6f10eb162d4565f36713accbe63e248b4bfa834beb388a43962f"} Mar 10 16:14:04 crc kubenswrapper[4749]: I0310 16:14:04.701188 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f5abc17f69a6f10eb162d4565f36713accbe63e248b4bfa834beb388a43962f" Mar 10 16:14:04 crc kubenswrapper[4749]: I0310 16:14:04.701493 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552654-cpwdb" Mar 10 16:14:05 crc kubenswrapper[4749]: I0310 16:14:05.023087 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552648-q6zd8"] Mar 10 16:14:05 crc kubenswrapper[4749]: I0310 16:14:05.024350 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552648-q6zd8"] Mar 10 16:14:05 crc kubenswrapper[4749]: I0310 16:14:05.617364 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e7e543-1b7f-49b7-af99-2677c8b0cd2c" path="/var/lib/kubelet/pods/75e7e543-1b7f-49b7-af99-2677c8b0cd2c/volumes" Mar 10 16:14:20 crc kubenswrapper[4749]: I0310 16:14:20.980220 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:14:20 crc kubenswrapper[4749]: I0310 16:14:20.980735 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:14:20 crc kubenswrapper[4749]: I0310 16:14:20.980790 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 16:14:20 crc kubenswrapper[4749]: I0310 16:14:20.981520 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f872d8a7b451ffda56ecf850ae4189e8cc776f81c01a3e3b24b44b064505fbb1"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:14:20 crc kubenswrapper[4749]: I0310 16:14:20.981588 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://f872d8a7b451ffda56ecf850ae4189e8cc776f81c01a3e3b24b44b064505fbb1" gracePeriod=600 Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.598515 4749 scope.go:117] "RemoveContainer" containerID="940bacbed596a6b64b32192506d5b4b3e282715aad28c953f1f7f4c388805cb7" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.625875 4749 scope.go:117] "RemoveContainer" containerID="e7b6c31999ed5399571573a74497b8409d438bd24ed0481b690c7a327019e8a7" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.650559 4749 scope.go:117] "RemoveContainer" containerID="ab8c1ce0bf3c8cbe9d4fea8597af8d5906a17d426930351c4a8cef5fb3330560" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.672668 4749 scope.go:117] "RemoveContainer" containerID="581d4900012f58fdc4fb204f974c1591440e9e62f7d5ce15a5af0e18990a645c" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.693559 4749 scope.go:117] "RemoveContainer" containerID="5371680630eb3603a303a6b8c490e07844917998f70f368075231557f3230b3f" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.716569 4749 scope.go:117] "RemoveContainer" containerID="e4bba11be08f890ec02580c33cdb287c152b439af58261322ddf2d38ca38ae5a" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.751529 4749 scope.go:117] "RemoveContainer" containerID="fa88134a23781b7668fa48a94dc185d173bf70de626fb3da7c54c3bf7925c7e0" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.769441 4749 scope.go:117] "RemoveContainer" containerID="d4691533207b9e7038404071271c9309ab895a8313e0edf96cf9f1552bd59949" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.786443 4749 scope.go:117] 
"RemoveContainer" containerID="d750aa36eabfd8fb7f0b2d62c0c5e42060ba161fab559795c901f4a4f19af276" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.803359 4749 scope.go:117] "RemoveContainer" containerID="85e9d1c09ca680177c97807f990b0a5b07025c1d5b2201fcbd4caf12b78fdf20" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.821122 4749 scope.go:117] "RemoveContainer" containerID="36ccb0b67d01f8e0c84de941ec69aa7a4955ba535082448c8b6fccd6ff57bdab" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.841357 4749 scope.go:117] "RemoveContainer" containerID="45eb6214436b2b1d33093e3aaa79629869ba461dde4208aab226427673052ba4" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.860446 4749 scope.go:117] "RemoveContainer" containerID="2a9d4b2ad47f1e5cf7acadb4a18e700e4bae86a432b69c0791bc4866a03bbcf6" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.871412 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="f872d8a7b451ffda56ecf850ae4189e8cc776f81c01a3e3b24b44b064505fbb1" exitCode=0 Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.871506 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"f872d8a7b451ffda56ecf850ae4189e8cc776f81c01a3e3b24b44b064505fbb1"} Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.871540 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa"} Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.871569 4749 scope.go:117] "RemoveContainer" containerID="106da756b634d444f1a07a98c656ecf91e046a9d0f74a54a7001a123a154d3af" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.881975 4749 
scope.go:117] "RemoveContainer" containerID="8f39916e690931ab258bcb7e6275112f699ad8897bb791ad2e833c45747e6c9f" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.913340 4749 scope.go:117] "RemoveContainer" containerID="04134bf7a94714ff5e0058f41e64303d4889270f61ee60ea65b0f477b594285b" Mar 10 16:14:21 crc kubenswrapper[4749]: I0310 16:14:21.936351 4749 scope.go:117] "RemoveContainer" containerID="3c06664a064c452407ad2995d00c78f80c14c3e61acf677ef85709e699cae912" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.172130 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x"] Mar 10 16:15:00 crc kubenswrapper[4749]: E0310 16:15:00.173026 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7445876f-74e1-4179-a304-c71e79e6d5d8" containerName="oc" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.173041 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7445876f-74e1-4179-a304-c71e79e6d5d8" containerName="oc" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.173227 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7445876f-74e1-4179-a304-c71e79e6d5d8" containerName="oc" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.174003 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.177734 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x"] Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.182836 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.183486 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.357921 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3a0be48-6c5a-4831-8371-6f60a1250eaa-secret-volume\") pod \"collect-profiles-29552655-8q95x\" (UID: \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.358104 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmw5m\" (UniqueName: \"kubernetes.io/projected/b3a0be48-6c5a-4831-8371-6f60a1250eaa-kube-api-access-xmw5m\") pod \"collect-profiles-29552655-8q95x\" (UID: \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.358322 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3a0be48-6c5a-4831-8371-6f60a1250eaa-config-volume\") pod \"collect-profiles-29552655-8q95x\" (UID: \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.459040 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3a0be48-6c5a-4831-8371-6f60a1250eaa-config-volume\") pod \"collect-profiles-29552655-8q95x\" (UID: \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.459114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3a0be48-6c5a-4831-8371-6f60a1250eaa-secret-volume\") pod \"collect-profiles-29552655-8q95x\" (UID: \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.459148 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmw5m\" (UniqueName: \"kubernetes.io/projected/b3a0be48-6c5a-4831-8371-6f60a1250eaa-kube-api-access-xmw5m\") pod \"collect-profiles-29552655-8q95x\" (UID: \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.460512 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3a0be48-6c5a-4831-8371-6f60a1250eaa-config-volume\") pod \"collect-profiles-29552655-8q95x\" (UID: \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.470610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b3a0be48-6c5a-4831-8371-6f60a1250eaa-secret-volume\") pod \"collect-profiles-29552655-8q95x\" (UID: \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.476540 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmw5m\" (UniqueName: \"kubernetes.io/projected/b3a0be48-6c5a-4831-8371-6f60a1250eaa-kube-api-access-xmw5m\") pod \"collect-profiles-29552655-8q95x\" (UID: \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.507668 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" Mar 10 16:15:00 crc kubenswrapper[4749]: I0310 16:15:00.957798 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x"] Mar 10 16:15:01 crc kubenswrapper[4749]: I0310 16:15:01.255814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" event={"ID":"b3a0be48-6c5a-4831-8371-6f60a1250eaa","Type":"ContainerStarted","Data":"6a54d61e01f2e24062754f111bb8a89a06d95070350664872098f58aad28fe0b"} Mar 10 16:15:01 crc kubenswrapper[4749]: I0310 16:15:01.256105 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" event={"ID":"b3a0be48-6c5a-4831-8371-6f60a1250eaa","Type":"ContainerStarted","Data":"9365355d7a65260e9bbdf82d8f8decfec55e1fd5f1682897aceaf39f9cbe2448"} Mar 10 16:15:01 crc kubenswrapper[4749]: I0310 16:15:01.271796 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" 
podStartSLOduration=1.271779137 podStartE2EDuration="1.271779137s" podCreationTimestamp="2026-03-10 16:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:15:01.27044371 +0000 UTC m=+1598.392309397" watchObservedRunningTime="2026-03-10 16:15:01.271779137 +0000 UTC m=+1598.393644824" Mar 10 16:15:02 crc kubenswrapper[4749]: I0310 16:15:02.265237 4749 generic.go:334] "Generic (PLEG): container finished" podID="b3a0be48-6c5a-4831-8371-6f60a1250eaa" containerID="6a54d61e01f2e24062754f111bb8a89a06d95070350664872098f58aad28fe0b" exitCode=0 Mar 10 16:15:02 crc kubenswrapper[4749]: I0310 16:15:02.265280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" event={"ID":"b3a0be48-6c5a-4831-8371-6f60a1250eaa","Type":"ContainerDied","Data":"6a54d61e01f2e24062754f111bb8a89a06d95070350664872098f58aad28fe0b"} Mar 10 16:15:03 crc kubenswrapper[4749]: I0310 16:15:03.600810 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" Mar 10 16:15:03 crc kubenswrapper[4749]: I0310 16:15:03.706324 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmw5m\" (UniqueName: \"kubernetes.io/projected/b3a0be48-6c5a-4831-8371-6f60a1250eaa-kube-api-access-xmw5m\") pod \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\" (UID: \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\") " Mar 10 16:15:03 crc kubenswrapper[4749]: I0310 16:15:03.706388 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3a0be48-6c5a-4831-8371-6f60a1250eaa-secret-volume\") pod \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\" (UID: \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\") " Mar 10 16:15:03 crc kubenswrapper[4749]: I0310 16:15:03.706433 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3a0be48-6c5a-4831-8371-6f60a1250eaa-config-volume\") pod \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\" (UID: \"b3a0be48-6c5a-4831-8371-6f60a1250eaa\") " Mar 10 16:15:03 crc kubenswrapper[4749]: I0310 16:15:03.707596 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3a0be48-6c5a-4831-8371-6f60a1250eaa-config-volume" (OuterVolumeSpecName: "config-volume") pod "b3a0be48-6c5a-4831-8371-6f60a1250eaa" (UID: "b3a0be48-6c5a-4831-8371-6f60a1250eaa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:15:03 crc kubenswrapper[4749]: I0310 16:15:03.712243 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3a0be48-6c5a-4831-8371-6f60a1250eaa-kube-api-access-xmw5m" (OuterVolumeSpecName: "kube-api-access-xmw5m") pod "b3a0be48-6c5a-4831-8371-6f60a1250eaa" (UID: "b3a0be48-6c5a-4831-8371-6f60a1250eaa"). 
InnerVolumeSpecName "kube-api-access-xmw5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:15:03 crc kubenswrapper[4749]: I0310 16:15:03.712569 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3a0be48-6c5a-4831-8371-6f60a1250eaa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b3a0be48-6c5a-4831-8371-6f60a1250eaa" (UID: "b3a0be48-6c5a-4831-8371-6f60a1250eaa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:15:03 crc kubenswrapper[4749]: I0310 16:15:03.808268 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmw5m\" (UniqueName: \"kubernetes.io/projected/b3a0be48-6c5a-4831-8371-6f60a1250eaa-kube-api-access-xmw5m\") on node \"crc\" DevicePath \"\"" Mar 10 16:15:03 crc kubenswrapper[4749]: I0310 16:15:03.808328 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3a0be48-6c5a-4831-8371-6f60a1250eaa-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 16:15:03 crc kubenswrapper[4749]: I0310 16:15:03.808351 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3a0be48-6c5a-4831-8371-6f60a1250eaa-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 16:15:04 crc kubenswrapper[4749]: I0310 16:15:04.285915 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" event={"ID":"b3a0be48-6c5a-4831-8371-6f60a1250eaa","Type":"ContainerDied","Data":"9365355d7a65260e9bbdf82d8f8decfec55e1fd5f1682897aceaf39f9cbe2448"} Mar 10 16:15:04 crc kubenswrapper[4749]: I0310 16:15:04.286280 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9365355d7a65260e9bbdf82d8f8decfec55e1fd5f1682897aceaf39f9cbe2448" Mar 10 16:15:04 crc kubenswrapper[4749]: I0310 16:15:04.286002 4749 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.137266 4749 scope.go:117] "RemoveContainer" containerID="697164b773efd20125629e5f2071a6d2299e396b5c4faf38e683d9df3cdd7620" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.204607 4749 scope.go:117] "RemoveContainer" containerID="73c803d28033d094dfccb98e8dc460d72e72e5409cdd11305221dfb63a577787" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.228477 4749 scope.go:117] "RemoveContainer" containerID="a1c3b653dfffe4b6f5efd250adb2e58d82df46e074759a1515886f87c5c6b213" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.272127 4749 scope.go:117] "RemoveContainer" containerID="8d2299df2487e769d3166ee36c6b6f3c511bda098ff25e59c2d9dbf96576abe9" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.315573 4749 scope.go:117] "RemoveContainer" containerID="efb0880b40e62eca1c113563c2bc06265d339a84125d6c22137d5537d40d5f14" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.348661 4749 scope.go:117] "RemoveContainer" containerID="52861deaa6b8da0398f6ad619b2142325877fc700973aeb0570d3800fe2acd5b" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.381092 4749 scope.go:117] "RemoveContainer" containerID="71646d3d80a269ec395cc73be3f051a86dc38c361242cd8c53640573173be447" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.432752 4749 scope.go:117] "RemoveContainer" containerID="6384072e7eda3838a0f4a2261ffc6756be1dc38df934506afb80c689b299737c" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.455010 4749 scope.go:117] "RemoveContainer" containerID="3ffe7e3606b9a1d0ae05453dacf818a20fdcf617da5f2909f4b18887e63f4394" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.487690 4749 scope.go:117] "RemoveContainer" containerID="1dc1d42c78b2ad1937fd19367d842bf5ad46d49552fa234892aba8a8ed5f0cf3" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.535025 4749 
scope.go:117] "RemoveContainer" containerID="0ef5e6d29e7ae24fea6a3d6d6e029a7cbf120f70ff6cc023e2ac5c6e35b2fbcd" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.566435 4749 scope.go:117] "RemoveContainer" containerID="8d114f77fae8a932170e8cb48be64485ebed83fe5f6058822201930cc9f723a6" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.596916 4749 scope.go:117] "RemoveContainer" containerID="7d1730370c1d50f08602e88fdd860a08bffb34493d3059343bab04e49ffcd929" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.628474 4749 scope.go:117] "RemoveContainer" containerID="19ae9480ebac8218d9075c95d6eaac4eeee5be1571a897161bbfe018546b3e2f" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.654846 4749 scope.go:117] "RemoveContainer" containerID="d5e76815f4569aaac9c8ae12048a95a98eba3d3491725cb5136e86634b3ac30e" Mar 10 16:15:22 crc kubenswrapper[4749]: I0310 16:15:22.686615 4749 scope.go:117] "RemoveContainer" containerID="304c08e229751c0227f2fdf35b286aa22b36f511bc4c299b58bf50e726217a1f" Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.347581 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rzl8l"] Mar 10 16:15:46 crc kubenswrapper[4749]: E0310 16:15:46.348503 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a0be48-6c5a-4831-8371-6f60a1250eaa" containerName="collect-profiles" Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.348522 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a0be48-6c5a-4831-8371-6f60a1250eaa" containerName="collect-profiles" Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.348737 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3a0be48-6c5a-4831-8371-6f60a1250eaa" containerName="collect-profiles" Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.349940 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.401506 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rzl8l"] Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.464923 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90787fba-89ba-4ad8-be47-7c27396e7634-utilities\") pod \"redhat-marketplace-rzl8l\" (UID: \"90787fba-89ba-4ad8-be47-7c27396e7634\") " pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.464999 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90787fba-89ba-4ad8-be47-7c27396e7634-catalog-content\") pod \"redhat-marketplace-rzl8l\" (UID: \"90787fba-89ba-4ad8-be47-7c27396e7634\") " pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.465153 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sjj9\" (UniqueName: \"kubernetes.io/projected/90787fba-89ba-4ad8-be47-7c27396e7634-kube-api-access-8sjj9\") pod \"redhat-marketplace-rzl8l\" (UID: \"90787fba-89ba-4ad8-be47-7c27396e7634\") " pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.566801 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90787fba-89ba-4ad8-be47-7c27396e7634-utilities\") pod \"redhat-marketplace-rzl8l\" (UID: \"90787fba-89ba-4ad8-be47-7c27396e7634\") " pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.566871 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90787fba-89ba-4ad8-be47-7c27396e7634-catalog-content\") pod \"redhat-marketplace-rzl8l\" (UID: \"90787fba-89ba-4ad8-be47-7c27396e7634\") " pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.566980 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sjj9\" (UniqueName: \"kubernetes.io/projected/90787fba-89ba-4ad8-be47-7c27396e7634-kube-api-access-8sjj9\") pod \"redhat-marketplace-rzl8l\" (UID: \"90787fba-89ba-4ad8-be47-7c27396e7634\") " pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.567586 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90787fba-89ba-4ad8-be47-7c27396e7634-catalog-content\") pod \"redhat-marketplace-rzl8l\" (UID: \"90787fba-89ba-4ad8-be47-7c27396e7634\") " pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.567681 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90787fba-89ba-4ad8-be47-7c27396e7634-utilities\") pod \"redhat-marketplace-rzl8l\" (UID: \"90787fba-89ba-4ad8-be47-7c27396e7634\") " pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.589533 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sjj9\" (UniqueName: \"kubernetes.io/projected/90787fba-89ba-4ad8-be47-7c27396e7634-kube-api-access-8sjj9\") pod \"redhat-marketplace-rzl8l\" (UID: \"90787fba-89ba-4ad8-be47-7c27396e7634\") " pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:46 crc kubenswrapper[4749]: I0310 16:15:46.669998 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:47 crc kubenswrapper[4749]: I0310 16:15:47.150583 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rzl8l"] Mar 10 16:15:47 crc kubenswrapper[4749]: I0310 16:15:47.725828 4749 generic.go:334] "Generic (PLEG): container finished" podID="90787fba-89ba-4ad8-be47-7c27396e7634" containerID="168f169a021bcf3031d82918f71802d63a92a32215bb689fa3acdec1d5821af6" exitCode=0 Mar 10 16:15:47 crc kubenswrapper[4749]: I0310 16:15:47.725956 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzl8l" event={"ID":"90787fba-89ba-4ad8-be47-7c27396e7634","Type":"ContainerDied","Data":"168f169a021bcf3031d82918f71802d63a92a32215bb689fa3acdec1d5821af6"} Mar 10 16:15:47 crc kubenswrapper[4749]: I0310 16:15:47.726425 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzl8l" event={"ID":"90787fba-89ba-4ad8-be47-7c27396e7634","Type":"ContainerStarted","Data":"4d13aefafc96e7dc5533391dc8b1a8255e72de1240e201b14605a0881db79ab1"} Mar 10 16:15:48 crc kubenswrapper[4749]: I0310 16:15:48.736872 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzl8l" event={"ID":"90787fba-89ba-4ad8-be47-7c27396e7634","Type":"ContainerStarted","Data":"2a5b4000e664393147bcb319c25f5c27628e0f46e5b3357d494935a96d9d8b38"} Mar 10 16:15:49 crc kubenswrapper[4749]: I0310 16:15:49.749116 4749 generic.go:334] "Generic (PLEG): container finished" podID="90787fba-89ba-4ad8-be47-7c27396e7634" containerID="2a5b4000e664393147bcb319c25f5c27628e0f46e5b3357d494935a96d9d8b38" exitCode=0 Mar 10 16:15:49 crc kubenswrapper[4749]: I0310 16:15:49.749737 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzl8l" 
event={"ID":"90787fba-89ba-4ad8-be47-7c27396e7634","Type":"ContainerDied","Data":"2a5b4000e664393147bcb319c25f5c27628e0f46e5b3357d494935a96d9d8b38"} Mar 10 16:15:50 crc kubenswrapper[4749]: I0310 16:15:50.758724 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzl8l" event={"ID":"90787fba-89ba-4ad8-be47-7c27396e7634","Type":"ContainerStarted","Data":"1138132a0a70b328461f05f2f7f2191e38624deca5db515041b2cf6b90b37f21"} Mar 10 16:15:50 crc kubenswrapper[4749]: I0310 16:15:50.784856 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rzl8l" podStartSLOduration=2.317716401 podStartE2EDuration="4.784828712s" podCreationTimestamp="2026-03-10 16:15:46 +0000 UTC" firstStartedPulling="2026-03-10 16:15:47.727843569 +0000 UTC m=+1644.849709256" lastFinishedPulling="2026-03-10 16:15:50.19495588 +0000 UTC m=+1647.316821567" observedRunningTime="2026-03-10 16:15:50.778524259 +0000 UTC m=+1647.900389966" watchObservedRunningTime="2026-03-10 16:15:50.784828712 +0000 UTC m=+1647.906694399" Mar 10 16:15:56 crc kubenswrapper[4749]: I0310 16:15:56.670523 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:56 crc kubenswrapper[4749]: I0310 16:15:56.671126 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:56 crc kubenswrapper[4749]: I0310 16:15:56.726095 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:56 crc kubenswrapper[4749]: I0310 16:15:56.854161 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:56 crc kubenswrapper[4749]: I0310 16:15:56.958501 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-rzl8l"] Mar 10 16:15:58 crc kubenswrapper[4749]: I0310 16:15:58.825754 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rzl8l" podUID="90787fba-89ba-4ad8-be47-7c27396e7634" containerName="registry-server" containerID="cri-o://1138132a0a70b328461f05f2f7f2191e38624deca5db515041b2cf6b90b37f21" gracePeriod=2 Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.266450 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.360982 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sjj9\" (UniqueName: \"kubernetes.io/projected/90787fba-89ba-4ad8-be47-7c27396e7634-kube-api-access-8sjj9\") pod \"90787fba-89ba-4ad8-be47-7c27396e7634\" (UID: \"90787fba-89ba-4ad8-be47-7c27396e7634\") " Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.361182 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90787fba-89ba-4ad8-be47-7c27396e7634-catalog-content\") pod \"90787fba-89ba-4ad8-be47-7c27396e7634\" (UID: \"90787fba-89ba-4ad8-be47-7c27396e7634\") " Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.361220 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90787fba-89ba-4ad8-be47-7c27396e7634-utilities\") pod \"90787fba-89ba-4ad8-be47-7c27396e7634\" (UID: \"90787fba-89ba-4ad8-be47-7c27396e7634\") " Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.362467 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90787fba-89ba-4ad8-be47-7c27396e7634-utilities" (OuterVolumeSpecName: "utilities") pod "90787fba-89ba-4ad8-be47-7c27396e7634" (UID: 
"90787fba-89ba-4ad8-be47-7c27396e7634"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.368034 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90787fba-89ba-4ad8-be47-7c27396e7634-kube-api-access-8sjj9" (OuterVolumeSpecName: "kube-api-access-8sjj9") pod "90787fba-89ba-4ad8-be47-7c27396e7634" (UID: "90787fba-89ba-4ad8-be47-7c27396e7634"). InnerVolumeSpecName "kube-api-access-8sjj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.388386 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90787fba-89ba-4ad8-be47-7c27396e7634-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90787fba-89ba-4ad8-be47-7c27396e7634" (UID: "90787fba-89ba-4ad8-be47-7c27396e7634"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.463209 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90787fba-89ba-4ad8-be47-7c27396e7634-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.463242 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90787fba-89ba-4ad8-be47-7c27396e7634-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.463253 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sjj9\" (UniqueName: \"kubernetes.io/projected/90787fba-89ba-4ad8-be47-7c27396e7634-kube-api-access-8sjj9\") on node \"crc\" DevicePath \"\"" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.838896 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="90787fba-89ba-4ad8-be47-7c27396e7634" containerID="1138132a0a70b328461f05f2f7f2191e38624deca5db515041b2cf6b90b37f21" exitCode=0 Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.838941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzl8l" event={"ID":"90787fba-89ba-4ad8-be47-7c27396e7634","Type":"ContainerDied","Data":"1138132a0a70b328461f05f2f7f2191e38624deca5db515041b2cf6b90b37f21"} Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.838968 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzl8l" event={"ID":"90787fba-89ba-4ad8-be47-7c27396e7634","Type":"ContainerDied","Data":"4d13aefafc96e7dc5533391dc8b1a8255e72de1240e201b14605a0881db79ab1"} Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.838986 4749 scope.go:117] "RemoveContainer" containerID="1138132a0a70b328461f05f2f7f2191e38624deca5db515041b2cf6b90b37f21" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.838985 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rzl8l" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.862975 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rzl8l"] Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.868389 4749 scope.go:117] "RemoveContainer" containerID="2a5b4000e664393147bcb319c25f5c27628e0f46e5b3357d494935a96d9d8b38" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.874732 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rzl8l"] Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.888503 4749 scope.go:117] "RemoveContainer" containerID="168f169a021bcf3031d82918f71802d63a92a32215bb689fa3acdec1d5821af6" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.928160 4749 scope.go:117] "RemoveContainer" containerID="1138132a0a70b328461f05f2f7f2191e38624deca5db515041b2cf6b90b37f21" Mar 10 16:15:59 crc kubenswrapper[4749]: E0310 16:15:59.928786 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1138132a0a70b328461f05f2f7f2191e38624deca5db515041b2cf6b90b37f21\": container with ID starting with 1138132a0a70b328461f05f2f7f2191e38624deca5db515041b2cf6b90b37f21 not found: ID does not exist" containerID="1138132a0a70b328461f05f2f7f2191e38624deca5db515041b2cf6b90b37f21" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.928853 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1138132a0a70b328461f05f2f7f2191e38624deca5db515041b2cf6b90b37f21"} err="failed to get container status \"1138132a0a70b328461f05f2f7f2191e38624deca5db515041b2cf6b90b37f21\": rpc error: code = NotFound desc = could not find container \"1138132a0a70b328461f05f2f7f2191e38624deca5db515041b2cf6b90b37f21\": container with ID starting with 1138132a0a70b328461f05f2f7f2191e38624deca5db515041b2cf6b90b37f21 not found: 
ID does not exist" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.928893 4749 scope.go:117] "RemoveContainer" containerID="2a5b4000e664393147bcb319c25f5c27628e0f46e5b3357d494935a96d9d8b38" Mar 10 16:15:59 crc kubenswrapper[4749]: E0310 16:15:59.929443 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a5b4000e664393147bcb319c25f5c27628e0f46e5b3357d494935a96d9d8b38\": container with ID starting with 2a5b4000e664393147bcb319c25f5c27628e0f46e5b3357d494935a96d9d8b38 not found: ID does not exist" containerID="2a5b4000e664393147bcb319c25f5c27628e0f46e5b3357d494935a96d9d8b38" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.929491 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a5b4000e664393147bcb319c25f5c27628e0f46e5b3357d494935a96d9d8b38"} err="failed to get container status \"2a5b4000e664393147bcb319c25f5c27628e0f46e5b3357d494935a96d9d8b38\": rpc error: code = NotFound desc = could not find container \"2a5b4000e664393147bcb319c25f5c27628e0f46e5b3357d494935a96d9d8b38\": container with ID starting with 2a5b4000e664393147bcb319c25f5c27628e0f46e5b3357d494935a96d9d8b38 not found: ID does not exist" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.929531 4749 scope.go:117] "RemoveContainer" containerID="168f169a021bcf3031d82918f71802d63a92a32215bb689fa3acdec1d5821af6" Mar 10 16:15:59 crc kubenswrapper[4749]: E0310 16:15:59.929873 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168f169a021bcf3031d82918f71802d63a92a32215bb689fa3acdec1d5821af6\": container with ID starting with 168f169a021bcf3031d82918f71802d63a92a32215bb689fa3acdec1d5821af6 not found: ID does not exist" containerID="168f169a021bcf3031d82918f71802d63a92a32215bb689fa3acdec1d5821af6" Mar 10 16:15:59 crc kubenswrapper[4749]: I0310 16:15:59.929909 4749 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168f169a021bcf3031d82918f71802d63a92a32215bb689fa3acdec1d5821af6"} err="failed to get container status \"168f169a021bcf3031d82918f71802d63a92a32215bb689fa3acdec1d5821af6\": rpc error: code = NotFound desc = could not find container \"168f169a021bcf3031d82918f71802d63a92a32215bb689fa3acdec1d5821af6\": container with ID starting with 168f169a021bcf3031d82918f71802d63a92a32215bb689fa3acdec1d5821af6 not found: ID does not exist" Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.143866 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552656-rb8kq"] Mar 10 16:16:00 crc kubenswrapper[4749]: E0310 16:16:00.144217 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90787fba-89ba-4ad8-be47-7c27396e7634" containerName="extract-content" Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.144233 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="90787fba-89ba-4ad8-be47-7c27396e7634" containerName="extract-content" Mar 10 16:16:00 crc kubenswrapper[4749]: E0310 16:16:00.144274 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90787fba-89ba-4ad8-be47-7c27396e7634" containerName="extract-utilities" Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.144283 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="90787fba-89ba-4ad8-be47-7c27396e7634" containerName="extract-utilities" Mar 10 16:16:00 crc kubenswrapper[4749]: E0310 16:16:00.144294 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90787fba-89ba-4ad8-be47-7c27396e7634" containerName="registry-server" Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.144301 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="90787fba-89ba-4ad8-be47-7c27396e7634" containerName="registry-server" Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.144516 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="90787fba-89ba-4ad8-be47-7c27396e7634" containerName="registry-server" Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.145121 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552656-rb8kq" Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.148676 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.149254 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.154679 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.156347 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552656-rb8kq"] Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.275801 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5mvc\" (UniqueName: \"kubernetes.io/projected/52f53359-bd54-4ae2-ac09-adf68d7a9eb8-kube-api-access-g5mvc\") pod \"auto-csr-approver-29552656-rb8kq\" (UID: \"52f53359-bd54-4ae2-ac09-adf68d7a9eb8\") " pod="openshift-infra/auto-csr-approver-29552656-rb8kq" Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.376625 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5mvc\" (UniqueName: \"kubernetes.io/projected/52f53359-bd54-4ae2-ac09-adf68d7a9eb8-kube-api-access-g5mvc\") pod \"auto-csr-approver-29552656-rb8kq\" (UID: \"52f53359-bd54-4ae2-ac09-adf68d7a9eb8\") " pod="openshift-infra/auto-csr-approver-29552656-rb8kq" Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.395009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5mvc\" 
(UniqueName: \"kubernetes.io/projected/52f53359-bd54-4ae2-ac09-adf68d7a9eb8-kube-api-access-g5mvc\") pod \"auto-csr-approver-29552656-rb8kq\" (UID: \"52f53359-bd54-4ae2-ac09-adf68d7a9eb8\") " pod="openshift-infra/auto-csr-approver-29552656-rb8kq" Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.461700 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552656-rb8kq" Mar 10 16:16:00 crc kubenswrapper[4749]: I0310 16:16:00.911086 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552656-rb8kq"] Mar 10 16:16:01 crc kubenswrapper[4749]: I0310 16:16:01.622883 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90787fba-89ba-4ad8-be47-7c27396e7634" path="/var/lib/kubelet/pods/90787fba-89ba-4ad8-be47-7c27396e7634/volumes" Mar 10 16:16:01 crc kubenswrapper[4749]: I0310 16:16:01.860364 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552656-rb8kq" event={"ID":"52f53359-bd54-4ae2-ac09-adf68d7a9eb8","Type":"ContainerStarted","Data":"30b8058db9eb23350576acd85867a4f670c138e85a35e5305f819ccc22e96ec8"} Mar 10 16:16:02 crc kubenswrapper[4749]: I0310 16:16:02.867958 4749 generic.go:334] "Generic (PLEG): container finished" podID="52f53359-bd54-4ae2-ac09-adf68d7a9eb8" containerID="f502e5e7f13f66721039f9643ff5a797150bcbce4483e694ed6659e0d94b0f12" exitCode=0 Mar 10 16:16:02 crc kubenswrapper[4749]: I0310 16:16:02.868072 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552656-rb8kq" event={"ID":"52f53359-bd54-4ae2-ac09-adf68d7a9eb8","Type":"ContainerDied","Data":"f502e5e7f13f66721039f9643ff5a797150bcbce4483e694ed6659e0d94b0f12"} Mar 10 16:16:04 crc kubenswrapper[4749]: I0310 16:16:04.144241 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552656-rb8kq" Mar 10 16:16:04 crc kubenswrapper[4749]: I0310 16:16:04.236850 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5mvc\" (UniqueName: \"kubernetes.io/projected/52f53359-bd54-4ae2-ac09-adf68d7a9eb8-kube-api-access-g5mvc\") pod \"52f53359-bd54-4ae2-ac09-adf68d7a9eb8\" (UID: \"52f53359-bd54-4ae2-ac09-adf68d7a9eb8\") " Mar 10 16:16:04 crc kubenswrapper[4749]: I0310 16:16:04.241741 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f53359-bd54-4ae2-ac09-adf68d7a9eb8-kube-api-access-g5mvc" (OuterVolumeSpecName: "kube-api-access-g5mvc") pod "52f53359-bd54-4ae2-ac09-adf68d7a9eb8" (UID: "52f53359-bd54-4ae2-ac09-adf68d7a9eb8"). InnerVolumeSpecName "kube-api-access-g5mvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:16:04 crc kubenswrapper[4749]: I0310 16:16:04.338907 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5mvc\" (UniqueName: \"kubernetes.io/projected/52f53359-bd54-4ae2-ac09-adf68d7a9eb8-kube-api-access-g5mvc\") on node \"crc\" DevicePath \"\"" Mar 10 16:16:04 crc kubenswrapper[4749]: I0310 16:16:04.885288 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552656-rb8kq" event={"ID":"52f53359-bd54-4ae2-ac09-adf68d7a9eb8","Type":"ContainerDied","Data":"30b8058db9eb23350576acd85867a4f670c138e85a35e5305f819ccc22e96ec8"} Mar 10 16:16:04 crc kubenswrapper[4749]: I0310 16:16:04.885331 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30b8058db9eb23350576acd85867a4f670c138e85a35e5305f819ccc22e96ec8" Mar 10 16:16:04 crc kubenswrapper[4749]: I0310 16:16:04.885357 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552656-rb8kq" Mar 10 16:16:05 crc kubenswrapper[4749]: I0310 16:16:05.210555 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552650-qd6cs"] Mar 10 16:16:05 crc kubenswrapper[4749]: I0310 16:16:05.216998 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552650-qd6cs"] Mar 10 16:16:05 crc kubenswrapper[4749]: I0310 16:16:05.616430 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b7cbb5-41b0-4439-a0b9-ce126583684c" path="/var/lib/kubelet/pods/d9b7cbb5-41b0-4439-a0b9-ce126583684c/volumes" Mar 10 16:16:22 crc kubenswrapper[4749]: I0310 16:16:22.965130 4749 scope.go:117] "RemoveContainer" containerID="12b17b01d50b727c55fa469a834ed3400e907fc685dc1d7f44466afffff37d2b" Mar 10 16:16:22 crc kubenswrapper[4749]: I0310 16:16:22.993998 4749 scope.go:117] "RemoveContainer" containerID="bb17514493f3006a9700ec5156a08c7d51cbec38b230b0424cddada1c317646a" Mar 10 16:16:23 crc kubenswrapper[4749]: I0310 16:16:23.026302 4749 scope.go:117] "RemoveContainer" containerID="979fafe5fb14a8e96ba3c95974f251f5e9ed6197a8ab83b091aa994aadb744a2" Mar 10 16:16:23 crc kubenswrapper[4749]: I0310 16:16:23.045641 4749 scope.go:117] "RemoveContainer" containerID="2f422d3b9d73bc88ff42a20aa5d8b2baa02aba46433f78fa49db5d0c304bdbd6" Mar 10 16:16:23 crc kubenswrapper[4749]: I0310 16:16:23.067642 4749 scope.go:117] "RemoveContainer" containerID="236964c999aec36cddb5fe0239f2b923e3a235f3f2a6498c0a9402202498207b" Mar 10 16:16:23 crc kubenswrapper[4749]: I0310 16:16:23.107197 4749 scope.go:117] "RemoveContainer" containerID="525b3d0120628b94472bbc40c5f17df9835454c772f2b39e875c3bd06452ddd5" Mar 10 16:16:23 crc kubenswrapper[4749]: I0310 16:16:23.155993 4749 scope.go:117] "RemoveContainer" containerID="d67ad0da493eea572574b3e20465891a70fb4ef286a417816a82688aa213456a" Mar 10 16:16:23 crc kubenswrapper[4749]: I0310 16:16:23.178274 4749 
scope.go:117] "RemoveContainer" containerID="217e914770081e2bccbbae1cf847983e071573c618440c545a24e7fc9b20a92f" Mar 10 16:16:23 crc kubenswrapper[4749]: I0310 16:16:23.210841 4749 scope.go:117] "RemoveContainer" containerID="c88c00c58bcfe5242271bb002d37c1a1a9cd3e5dd3b4b9465326ba4e737970b2" Mar 10 16:16:23 crc kubenswrapper[4749]: I0310 16:16:23.232316 4749 scope.go:117] "RemoveContainer" containerID="42bf975dd6dd1e70cf8f35b3e289f5e0be89fe07587e2e533e76c74511cca2f6" Mar 10 16:16:23 crc kubenswrapper[4749]: I0310 16:16:23.252125 4749 scope.go:117] "RemoveContainer" containerID="3c8d84eda7a09a27be2cbbd7dd5b4c073f4fa684f122a6c3187ab2737bdf593d" Mar 10 16:16:23 crc kubenswrapper[4749]: I0310 16:16:23.312477 4749 scope.go:117] "RemoveContainer" containerID="3aacb276a70bd2dd9c3f2e14a1f0784b86c6019fd8b98d5042d0b2baa9fde022" Mar 10 16:16:23 crc kubenswrapper[4749]: I0310 16:16:23.336307 4749 scope.go:117] "RemoveContainer" containerID="ca2411d2a7587ab136bf4f00c059256ccb860c1948771afc38ee76e061d6b749" Mar 10 16:16:50 crc kubenswrapper[4749]: I0310 16:16:50.980011 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:16:50 crc kubenswrapper[4749]: I0310 16:16:50.980690 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:17:20 crc kubenswrapper[4749]: I0310 16:17:20.980916 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:17:20 crc kubenswrapper[4749]: I0310 16:17:20.981619 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:17:23 crc kubenswrapper[4749]: I0310 16:17:23.544376 4749 scope.go:117] "RemoveContainer" containerID="17f3949302b878f73539d2cfebec421f4f33ea6abd3c520c71fdcdad36b2d285" Mar 10 16:17:23 crc kubenswrapper[4749]: I0310 16:17:23.589060 4749 scope.go:117] "RemoveContainer" containerID="b0e71ba5febb6a769078d2e5e9beb76c2914f70feafac428695b051ac58e6b77" Mar 10 16:17:23 crc kubenswrapper[4749]: I0310 16:17:23.627572 4749 scope.go:117] "RemoveContainer" containerID="d98ff629c8a6f49966351adb1d7774cf20efbbd1a83fc5945d726cf4ccbcc436" Mar 10 16:17:23 crc kubenswrapper[4749]: I0310 16:17:23.650896 4749 scope.go:117] "RemoveContainer" containerID="a96cb0fcbf7265ea7eab4e2c14814d62be59f1b97490561ea4c3657b140c4cff" Mar 10 16:17:50 crc kubenswrapper[4749]: I0310 16:17:50.980951 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:17:50 crc kubenswrapper[4749]: I0310 16:17:50.981890 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:17:50 crc 
kubenswrapper[4749]: I0310 16:17:50.981978 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 16:17:50 crc kubenswrapper[4749]: I0310 16:17:50.983539 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:17:50 crc kubenswrapper[4749]: I0310 16:17:50.983644 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" gracePeriod=600 Mar 10 16:17:51 crc kubenswrapper[4749]: E0310 16:17:51.143106 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:17:51 crc kubenswrapper[4749]: I0310 16:17:51.905893 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" exitCode=0 Mar 10 16:17:51 crc kubenswrapper[4749]: I0310 16:17:51.905972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa"} Mar 10 16:17:51 crc kubenswrapper[4749]: I0310 16:17:51.906039 4749 scope.go:117] "RemoveContainer" containerID="f872d8a7b451ffda56ecf850ae4189e8cc776f81c01a3e3b24b44b064505fbb1" Mar 10 16:17:51 crc kubenswrapper[4749]: I0310 16:17:51.906998 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:17:51 crc kubenswrapper[4749]: E0310 16:17:51.907560 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:18:00 crc kubenswrapper[4749]: I0310 16:18:00.158358 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552658-zchgs"] Mar 10 16:18:00 crc kubenswrapper[4749]: E0310 16:18:00.159222 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f53359-bd54-4ae2-ac09-adf68d7a9eb8" containerName="oc" Mar 10 16:18:00 crc kubenswrapper[4749]: I0310 16:18:00.159239 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f53359-bd54-4ae2-ac09-adf68d7a9eb8" containerName="oc" Mar 10 16:18:00 crc kubenswrapper[4749]: I0310 16:18:00.159447 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52f53359-bd54-4ae2-ac09-adf68d7a9eb8" containerName="oc" Mar 10 16:18:00 crc kubenswrapper[4749]: I0310 16:18:00.160102 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552658-zchgs" Mar 10 16:18:00 crc kubenswrapper[4749]: I0310 16:18:00.162930 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:18:00 crc kubenswrapper[4749]: I0310 16:18:00.163215 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:18:00 crc kubenswrapper[4749]: I0310 16:18:00.163420 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:18:00 crc kubenswrapper[4749]: I0310 16:18:00.175034 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552658-zchgs"] Mar 10 16:18:00 crc kubenswrapper[4749]: I0310 16:18:00.203991 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrnmg\" (UniqueName: \"kubernetes.io/projected/802a5c46-b864-40c5-80d5-28dde91eaa3a-kube-api-access-qrnmg\") pod \"auto-csr-approver-29552658-zchgs\" (UID: \"802a5c46-b864-40c5-80d5-28dde91eaa3a\") " pod="openshift-infra/auto-csr-approver-29552658-zchgs" Mar 10 16:18:00 crc kubenswrapper[4749]: I0310 16:18:00.305328 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrnmg\" (UniqueName: \"kubernetes.io/projected/802a5c46-b864-40c5-80d5-28dde91eaa3a-kube-api-access-qrnmg\") pod \"auto-csr-approver-29552658-zchgs\" (UID: \"802a5c46-b864-40c5-80d5-28dde91eaa3a\") " pod="openshift-infra/auto-csr-approver-29552658-zchgs" Mar 10 16:18:00 crc kubenswrapper[4749]: I0310 16:18:00.339336 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrnmg\" (UniqueName: \"kubernetes.io/projected/802a5c46-b864-40c5-80d5-28dde91eaa3a-kube-api-access-qrnmg\") pod \"auto-csr-approver-29552658-zchgs\" (UID: \"802a5c46-b864-40c5-80d5-28dde91eaa3a\") " 
pod="openshift-infra/auto-csr-approver-29552658-zchgs" Mar 10 16:18:00 crc kubenswrapper[4749]: I0310 16:18:00.489344 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552658-zchgs" Mar 10 16:18:00 crc kubenswrapper[4749]: I0310 16:18:00.742787 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552658-zchgs"] Mar 10 16:18:00 crc kubenswrapper[4749]: I0310 16:18:00.984052 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552658-zchgs" event={"ID":"802a5c46-b864-40c5-80d5-28dde91eaa3a","Type":"ContainerStarted","Data":"8a3156e6e45a492daca5440a41fafcd64db6e838b144e420972d7c83fc1e953b"} Mar 10 16:18:02 crc kubenswrapper[4749]: I0310 16:18:02.001437 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552658-zchgs" event={"ID":"802a5c46-b864-40c5-80d5-28dde91eaa3a","Type":"ContainerStarted","Data":"cc530f1797b3e1b061394005ac4d9b380ef006602c5fbd08c91af5c4d9a6213d"} Mar 10 16:18:03 crc kubenswrapper[4749]: I0310 16:18:03.014448 4749 generic.go:334] "Generic (PLEG): container finished" podID="802a5c46-b864-40c5-80d5-28dde91eaa3a" containerID="cc530f1797b3e1b061394005ac4d9b380ef006602c5fbd08c91af5c4d9a6213d" exitCode=0 Mar 10 16:18:03 crc kubenswrapper[4749]: I0310 16:18:03.014610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552658-zchgs" event={"ID":"802a5c46-b864-40c5-80d5-28dde91eaa3a","Type":"ContainerDied","Data":"cc530f1797b3e1b061394005ac4d9b380ef006602c5fbd08c91af5c4d9a6213d"} Mar 10 16:18:04 crc kubenswrapper[4749]: I0310 16:18:04.355972 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552658-zchgs" Mar 10 16:18:04 crc kubenswrapper[4749]: I0310 16:18:04.470576 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrnmg\" (UniqueName: \"kubernetes.io/projected/802a5c46-b864-40c5-80d5-28dde91eaa3a-kube-api-access-qrnmg\") pod \"802a5c46-b864-40c5-80d5-28dde91eaa3a\" (UID: \"802a5c46-b864-40c5-80d5-28dde91eaa3a\") " Mar 10 16:18:04 crc kubenswrapper[4749]: I0310 16:18:04.479109 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/802a5c46-b864-40c5-80d5-28dde91eaa3a-kube-api-access-qrnmg" (OuterVolumeSpecName: "kube-api-access-qrnmg") pod "802a5c46-b864-40c5-80d5-28dde91eaa3a" (UID: "802a5c46-b864-40c5-80d5-28dde91eaa3a"). InnerVolumeSpecName "kube-api-access-qrnmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:18:04 crc kubenswrapper[4749]: I0310 16:18:04.572526 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrnmg\" (UniqueName: \"kubernetes.io/projected/802a5c46-b864-40c5-80d5-28dde91eaa3a-kube-api-access-qrnmg\") on node \"crc\" DevicePath \"\"" Mar 10 16:18:05 crc kubenswrapper[4749]: I0310 16:18:05.040007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552658-zchgs" event={"ID":"802a5c46-b864-40c5-80d5-28dde91eaa3a","Type":"ContainerDied","Data":"8a3156e6e45a492daca5440a41fafcd64db6e838b144e420972d7c83fc1e953b"} Mar 10 16:18:05 crc kubenswrapper[4749]: I0310 16:18:05.040065 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a3156e6e45a492daca5440a41fafcd64db6e838b144e420972d7c83fc1e953b" Mar 10 16:18:05 crc kubenswrapper[4749]: I0310 16:18:05.040111 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552658-zchgs" Mar 10 16:18:05 crc kubenswrapper[4749]: I0310 16:18:05.116253 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552652-x8p4n"] Mar 10 16:18:05 crc kubenswrapper[4749]: I0310 16:18:05.123646 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552652-x8p4n"] Mar 10 16:18:05 crc kubenswrapper[4749]: I0310 16:18:05.622767 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8077178-4215-4f40-8aff-2dd8f4766821" path="/var/lib/kubelet/pods/e8077178-4215-4f40-8aff-2dd8f4766821/volumes" Mar 10 16:18:06 crc kubenswrapper[4749]: I0310 16:18:06.607161 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:18:06 crc kubenswrapper[4749]: E0310 16:18:06.607877 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:18:21 crc kubenswrapper[4749]: I0310 16:18:21.607248 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:18:21 crc kubenswrapper[4749]: E0310 16:18:21.608039 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:18:23 crc kubenswrapper[4749]: I0310 16:18:23.720985 4749 scope.go:117] "RemoveContainer" containerID="72f8a33df83b0d373bbc75a2ec6ea7a30edc86dd30840557b6a3948d5156af59" Mar 10 16:18:23 crc kubenswrapper[4749]: I0310 16:18:23.791185 4749 scope.go:117] "RemoveContainer" containerID="c5c3ca0f09ffdc3b0ca768bffd9853e585b15e8e8d35502eee9fe5cdf2e81621" Mar 10 16:18:23 crc kubenswrapper[4749]: I0310 16:18:23.821722 4749 scope.go:117] "RemoveContainer" containerID="fec2e3f9d6052f4d4ff97e50b449c59e790b93ff8700dd853168b392f78e6839" Mar 10 16:18:23 crc kubenswrapper[4749]: I0310 16:18:23.851135 4749 scope.go:117] "RemoveContainer" containerID="696da172e02fe62a97048ef4e054fc5dd19a2680cfd820003727745d5cc14db9" Mar 10 16:18:36 crc kubenswrapper[4749]: I0310 16:18:36.607704 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:18:36 crc kubenswrapper[4749]: E0310 16:18:36.609275 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:18:48 crc kubenswrapper[4749]: I0310 16:18:48.607116 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:18:48 crc kubenswrapper[4749]: E0310 16:18:48.607971 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:19:03 crc kubenswrapper[4749]: I0310 16:19:03.610550 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:19:03 crc kubenswrapper[4749]: E0310 16:19:03.611140 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:19:16 crc kubenswrapper[4749]: I0310 16:19:16.606917 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:19:16 crc kubenswrapper[4749]: E0310 16:19:16.607513 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:19:28 crc kubenswrapper[4749]: I0310 16:19:28.607666 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:19:28 crc kubenswrapper[4749]: E0310 16:19:28.608738 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:19:43 crc kubenswrapper[4749]: I0310 16:19:43.610508 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:19:43 crc kubenswrapper[4749]: E0310 16:19:43.611469 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.718392 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cdmbq"] Mar 10 16:19:56 crc kubenswrapper[4749]: E0310 16:19:56.719218 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="802a5c46-b864-40c5-80d5-28dde91eaa3a" containerName="oc" Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.719233 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="802a5c46-b864-40c5-80d5-28dde91eaa3a" containerName="oc" Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.719428 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="802a5c46-b864-40c5-80d5-28dde91eaa3a" containerName="oc" Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.720574 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.732078 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cdmbq"] Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.860408 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht4dd\" (UniqueName: \"kubernetes.io/projected/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-kube-api-access-ht4dd\") pod \"community-operators-cdmbq\" (UID: \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\") " pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.860475 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-catalog-content\") pod \"community-operators-cdmbq\" (UID: \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\") " pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.860532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-utilities\") pod \"community-operators-cdmbq\" (UID: \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\") " pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.920469 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wqtn7"] Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.921983 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.932794 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wqtn7"] Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.961472 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-utilities\") pod \"community-operators-cdmbq\" (UID: \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\") " pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.961613 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht4dd\" (UniqueName: \"kubernetes.io/projected/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-kube-api-access-ht4dd\") pod \"community-operators-cdmbq\" (UID: \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\") " pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.961643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-catalog-content\") pod \"community-operators-cdmbq\" (UID: \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\") " pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.962066 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-utilities\") pod \"community-operators-cdmbq\" (UID: \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\") " pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.962097 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-catalog-content\") pod \"community-operators-cdmbq\" (UID: \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\") " pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:19:56 crc kubenswrapper[4749]: I0310 16:19:56.984450 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht4dd\" (UniqueName: \"kubernetes.io/projected/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-kube-api-access-ht4dd\") pod \"community-operators-cdmbq\" (UID: \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\") " pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:19:57 crc kubenswrapper[4749]: I0310 16:19:57.047875 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:19:57 crc kubenswrapper[4749]: I0310 16:19:57.063030 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de5a75e0-c933-4f2c-829f-a44ae64dc482-catalog-content\") pod \"certified-operators-wqtn7\" (UID: \"de5a75e0-c933-4f2c-829f-a44ae64dc482\") " pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:19:57 crc kubenswrapper[4749]: I0310 16:19:57.063308 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87tg8\" (UniqueName: \"kubernetes.io/projected/de5a75e0-c933-4f2c-829f-a44ae64dc482-kube-api-access-87tg8\") pod \"certified-operators-wqtn7\" (UID: \"de5a75e0-c933-4f2c-829f-a44ae64dc482\") " pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:19:57 crc kubenswrapper[4749]: I0310 16:19:57.063358 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de5a75e0-c933-4f2c-829f-a44ae64dc482-utilities\") pod \"certified-operators-wqtn7\" (UID: 
\"de5a75e0-c933-4f2c-829f-a44ae64dc482\") " pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:19:57 crc kubenswrapper[4749]: I0310 16:19:57.164591 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de5a75e0-c933-4f2c-829f-a44ae64dc482-catalog-content\") pod \"certified-operators-wqtn7\" (UID: \"de5a75e0-c933-4f2c-829f-a44ae64dc482\") " pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:19:57 crc kubenswrapper[4749]: I0310 16:19:57.164636 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87tg8\" (UniqueName: \"kubernetes.io/projected/de5a75e0-c933-4f2c-829f-a44ae64dc482-kube-api-access-87tg8\") pod \"certified-operators-wqtn7\" (UID: \"de5a75e0-c933-4f2c-829f-a44ae64dc482\") " pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:19:57 crc kubenswrapper[4749]: I0310 16:19:57.164694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de5a75e0-c933-4f2c-829f-a44ae64dc482-utilities\") pod \"certified-operators-wqtn7\" (UID: \"de5a75e0-c933-4f2c-829f-a44ae64dc482\") " pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:19:57 crc kubenswrapper[4749]: I0310 16:19:57.165276 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de5a75e0-c933-4f2c-829f-a44ae64dc482-utilities\") pod \"certified-operators-wqtn7\" (UID: \"de5a75e0-c933-4f2c-829f-a44ae64dc482\") " pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:19:57 crc kubenswrapper[4749]: I0310 16:19:57.165556 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de5a75e0-c933-4f2c-829f-a44ae64dc482-catalog-content\") pod \"certified-operators-wqtn7\" (UID: \"de5a75e0-c933-4f2c-829f-a44ae64dc482\") 
" pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:19:57 crc kubenswrapper[4749]: I0310 16:19:57.226164 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87tg8\" (UniqueName: \"kubernetes.io/projected/de5a75e0-c933-4f2c-829f-a44ae64dc482-kube-api-access-87tg8\") pod \"certified-operators-wqtn7\" (UID: \"de5a75e0-c933-4f2c-829f-a44ae64dc482\") " pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:19:57 crc kubenswrapper[4749]: I0310 16:19:57.239120 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:19:57 crc kubenswrapper[4749]: I0310 16:19:57.459432 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cdmbq"] Mar 10 16:19:57 crc kubenswrapper[4749]: I0310 16:19:57.648507 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wqtn7"] Mar 10 16:19:57 crc kubenswrapper[4749]: W0310 16:19:57.713262 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde5a75e0_c933_4f2c_829f_a44ae64dc482.slice/crio-3990e61e9a5e7938f891eeeda2a463147ebf9f1b687ce78424d6bd970aa7faa0 WatchSource:0}: Error finding container 3990e61e9a5e7938f891eeeda2a463147ebf9f1b687ce78424d6bd970aa7faa0: Status 404 returned error can't find the container with id 3990e61e9a5e7938f891eeeda2a463147ebf9f1b687ce78424d6bd970aa7faa0 Mar 10 16:19:58 crc kubenswrapper[4749]: I0310 16:19:58.175627 4749 generic.go:334] "Generic (PLEG): container finished" podID="de5a75e0-c933-4f2c-829f-a44ae64dc482" containerID="4ca19c7e48e36895f316b452339617583379664ca9e4be69756844566d0f9998" exitCode=0 Mar 10 16:19:58 crc kubenswrapper[4749]: I0310 16:19:58.175932 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqtn7" 
event={"ID":"de5a75e0-c933-4f2c-829f-a44ae64dc482","Type":"ContainerDied","Data":"4ca19c7e48e36895f316b452339617583379664ca9e4be69756844566d0f9998"} Mar 10 16:19:58 crc kubenswrapper[4749]: I0310 16:19:58.175964 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqtn7" event={"ID":"de5a75e0-c933-4f2c-829f-a44ae64dc482","Type":"ContainerStarted","Data":"3990e61e9a5e7938f891eeeda2a463147ebf9f1b687ce78424d6bd970aa7faa0"} Mar 10 16:19:58 crc kubenswrapper[4749]: I0310 16:19:58.177839 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:19:58 crc kubenswrapper[4749]: I0310 16:19:58.179704 4749 generic.go:334] "Generic (PLEG): container finished" podID="efb7b6e3-63ab-49c7-96cf-a4d45888fc28" containerID="22665903e7f35258ba446d9aad4b6e0eb250492d6f6dd2d1e5d8e7794e44e16b" exitCode=0 Mar 10 16:19:58 crc kubenswrapper[4749]: I0310 16:19:58.179740 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdmbq" event={"ID":"efb7b6e3-63ab-49c7-96cf-a4d45888fc28","Type":"ContainerDied","Data":"22665903e7f35258ba446d9aad4b6e0eb250492d6f6dd2d1e5d8e7794e44e16b"} Mar 10 16:19:58 crc kubenswrapper[4749]: I0310 16:19:58.179766 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdmbq" event={"ID":"efb7b6e3-63ab-49c7-96cf-a4d45888fc28","Type":"ContainerStarted","Data":"c60eaddc58bb15150eea474639daf2d24deaef654ec5701dbd9798f9c4549c95"} Mar 10 16:19:58 crc kubenswrapper[4749]: I0310 16:19:58.607563 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:19:58 crc kubenswrapper[4749]: E0310 16:19:58.608081 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.189562 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqtn7" event={"ID":"de5a75e0-c933-4f2c-829f-a44ae64dc482","Type":"ContainerStarted","Data":"cf54d3a738626ba017ac2388f71c4e26a7df855f02817f6419a9a2a1b124198e"} Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.191848 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdmbq" event={"ID":"efb7b6e3-63ab-49c7-96cf-a4d45888fc28","Type":"ContainerStarted","Data":"99ddb120d44b9d1c819d71bd40e5d46462c18398cb60fdd2aa78e3845e5807c0"} Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.325812 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vh8d2"] Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.327446 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.334904 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vh8d2"] Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.414116 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zvl7\" (UniqueName: \"kubernetes.io/projected/c946100b-1bf3-4abe-b0ce-af1638f2bba0-kube-api-access-8zvl7\") pod \"redhat-operators-vh8d2\" (UID: \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\") " pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.414273 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c946100b-1bf3-4abe-b0ce-af1638f2bba0-utilities\") pod \"redhat-operators-vh8d2\" (UID: \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\") " pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.414367 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c946100b-1bf3-4abe-b0ce-af1638f2bba0-catalog-content\") pod \"redhat-operators-vh8d2\" (UID: \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\") " pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.516021 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c946100b-1bf3-4abe-b0ce-af1638f2bba0-catalog-content\") pod \"redhat-operators-vh8d2\" (UID: \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\") " pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.516123 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-8zvl7\" (UniqueName: \"kubernetes.io/projected/c946100b-1bf3-4abe-b0ce-af1638f2bba0-kube-api-access-8zvl7\") pod \"redhat-operators-vh8d2\" (UID: \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\") " pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.516201 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c946100b-1bf3-4abe-b0ce-af1638f2bba0-utilities\") pod \"redhat-operators-vh8d2\" (UID: \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\") " pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.516745 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c946100b-1bf3-4abe-b0ce-af1638f2bba0-catalog-content\") pod \"redhat-operators-vh8d2\" (UID: \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\") " pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.516780 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c946100b-1bf3-4abe-b0ce-af1638f2bba0-utilities\") pod \"redhat-operators-vh8d2\" (UID: \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\") " pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.536683 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zvl7\" (UniqueName: \"kubernetes.io/projected/c946100b-1bf3-4abe-b0ce-af1638f2bba0-kube-api-access-8zvl7\") pod \"redhat-operators-vh8d2\" (UID: \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\") " pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:19:59 crc kubenswrapper[4749]: I0310 16:19:59.678667 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.150755 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552660-r9m2m"] Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.152331 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552660-r9m2m" Mar 10 16:20:00 crc kubenswrapper[4749]: W0310 16:20:00.158074 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc946100b_1bf3_4abe_b0ce_af1638f2bba0.slice/crio-132b015a41edc4b59ff84db12e8c805ac543ae3c153763c5ef2533a7b79046e0 WatchSource:0}: Error finding container 132b015a41edc4b59ff84db12e8c805ac543ae3c153763c5ef2533a7b79046e0: Status 404 returned error can't find the container with id 132b015a41edc4b59ff84db12e8c805ac543ae3c153763c5ef2533a7b79046e0 Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.159392 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.159677 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.161954 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.163560 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vh8d2"] Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.170118 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552660-r9m2m"] Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.202033 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="de5a75e0-c933-4f2c-829f-a44ae64dc482" containerID="cf54d3a738626ba017ac2388f71c4e26a7df855f02817f6419a9a2a1b124198e" exitCode=0 Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.202964 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqtn7" event={"ID":"de5a75e0-c933-4f2c-829f-a44ae64dc482","Type":"ContainerDied","Data":"cf54d3a738626ba017ac2388f71c4e26a7df855f02817f6419a9a2a1b124198e"} Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.205560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh8d2" event={"ID":"c946100b-1bf3-4abe-b0ce-af1638f2bba0","Type":"ContainerStarted","Data":"132b015a41edc4b59ff84db12e8c805ac543ae3c153763c5ef2533a7b79046e0"} Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.209290 4749 generic.go:334] "Generic (PLEG): container finished" podID="efb7b6e3-63ab-49c7-96cf-a4d45888fc28" containerID="99ddb120d44b9d1c819d71bd40e5d46462c18398cb60fdd2aa78e3845e5807c0" exitCode=0 Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.209338 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdmbq" event={"ID":"efb7b6e3-63ab-49c7-96cf-a4d45888fc28","Type":"ContainerDied","Data":"99ddb120d44b9d1c819d71bd40e5d46462c18398cb60fdd2aa78e3845e5807c0"} Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.227109 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jr52\" (UniqueName: \"kubernetes.io/projected/f563ee69-e442-4df7-b8f5-e0b58d48e00e-kube-api-access-5jr52\") pod \"auto-csr-approver-29552660-r9m2m\" (UID: \"f563ee69-e442-4df7-b8f5-e0b58d48e00e\") " pod="openshift-infra/auto-csr-approver-29552660-r9m2m" Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.328579 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jr52\" (UniqueName: 
\"kubernetes.io/projected/f563ee69-e442-4df7-b8f5-e0b58d48e00e-kube-api-access-5jr52\") pod \"auto-csr-approver-29552660-r9m2m\" (UID: \"f563ee69-e442-4df7-b8f5-e0b58d48e00e\") " pod="openshift-infra/auto-csr-approver-29552660-r9m2m" Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.347213 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jr52\" (UniqueName: \"kubernetes.io/projected/f563ee69-e442-4df7-b8f5-e0b58d48e00e-kube-api-access-5jr52\") pod \"auto-csr-approver-29552660-r9m2m\" (UID: \"f563ee69-e442-4df7-b8f5-e0b58d48e00e\") " pod="openshift-infra/auto-csr-approver-29552660-r9m2m" Mar 10 16:20:00 crc kubenswrapper[4749]: I0310 16:20:00.574036 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552660-r9m2m" Mar 10 16:20:01 crc kubenswrapper[4749]: I0310 16:20:01.064156 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552660-r9m2m"] Mar 10 16:20:01 crc kubenswrapper[4749]: W0310 16:20:01.065620 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf563ee69_e442_4df7_b8f5_e0b58d48e00e.slice/crio-487511b6b3b1b3f99ca33102751bbcd90f8505ca933b0e76b294a17f906f7b9d WatchSource:0}: Error finding container 487511b6b3b1b3f99ca33102751bbcd90f8505ca933b0e76b294a17f906f7b9d: Status 404 returned error can't find the container with id 487511b6b3b1b3f99ca33102751bbcd90f8505ca933b0e76b294a17f906f7b9d Mar 10 16:20:01 crc kubenswrapper[4749]: I0310 16:20:01.217978 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqtn7" event={"ID":"de5a75e0-c933-4f2c-829f-a44ae64dc482","Type":"ContainerStarted","Data":"7adac0fab0bd90470ae686bb7d6733c7b2f39815eb752b07745b0ef4ca789ced"} Mar 10 16:20:01 crc kubenswrapper[4749]: I0310 16:20:01.219879 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="c946100b-1bf3-4abe-b0ce-af1638f2bba0" containerID="4bc99f0118f09be29be0727921eba112de1ed184be73a3bf0da10f0bf5ffb078" exitCode=0 Mar 10 16:20:01 crc kubenswrapper[4749]: I0310 16:20:01.219946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh8d2" event={"ID":"c946100b-1bf3-4abe-b0ce-af1638f2bba0","Type":"ContainerDied","Data":"4bc99f0118f09be29be0727921eba112de1ed184be73a3bf0da10f0bf5ffb078"} Mar 10 16:20:01 crc kubenswrapper[4749]: I0310 16:20:01.223110 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdmbq" event={"ID":"efb7b6e3-63ab-49c7-96cf-a4d45888fc28","Type":"ContainerStarted","Data":"5836a17f46bce9647224ce15b7b28194e209fd2034ef76c8fd236db6c32194d0"} Mar 10 16:20:01 crc kubenswrapper[4749]: I0310 16:20:01.225987 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552660-r9m2m" event={"ID":"f563ee69-e442-4df7-b8f5-e0b58d48e00e","Type":"ContainerStarted","Data":"487511b6b3b1b3f99ca33102751bbcd90f8505ca933b0e76b294a17f906f7b9d"} Mar 10 16:20:01 crc kubenswrapper[4749]: I0310 16:20:01.250088 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wqtn7" podStartSLOduration=2.642954617 podStartE2EDuration="5.250070976s" podCreationTimestamp="2026-03-10 16:19:56 +0000 UTC" firstStartedPulling="2026-03-10 16:19:58.177471659 +0000 UTC m=+1895.299337366" lastFinishedPulling="2026-03-10 16:20:00.784588048 +0000 UTC m=+1897.906453725" observedRunningTime="2026-03-10 16:20:01.248132543 +0000 UTC m=+1898.369998230" watchObservedRunningTime="2026-03-10 16:20:01.250070976 +0000 UTC m=+1898.371936663" Mar 10 16:20:01 crc kubenswrapper[4749]: I0310 16:20:01.335830 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cdmbq" podStartSLOduration=2.804106349 podStartE2EDuration="5.335795367s" 
podCreationTimestamp="2026-03-10 16:19:56 +0000 UTC" firstStartedPulling="2026-03-10 16:19:58.18114474 +0000 UTC m=+1895.303010437" lastFinishedPulling="2026-03-10 16:20:00.712833768 +0000 UTC m=+1897.834699455" observedRunningTime="2026-03-10 16:20:01.288055834 +0000 UTC m=+1898.409921531" watchObservedRunningTime="2026-03-10 16:20:01.335795367 +0000 UTC m=+1898.457661054" Mar 10 16:20:03 crc kubenswrapper[4749]: I0310 16:20:03.242451 4749 generic.go:334] "Generic (PLEG): container finished" podID="f563ee69-e442-4df7-b8f5-e0b58d48e00e" containerID="68e59d847a6ebfa9f44d4ff4f97e9ce4189678c91efb26361081c371dbdbd9af" exitCode=0 Mar 10 16:20:03 crc kubenswrapper[4749]: I0310 16:20:03.242510 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552660-r9m2m" event={"ID":"f563ee69-e442-4df7-b8f5-e0b58d48e00e","Type":"ContainerDied","Data":"68e59d847a6ebfa9f44d4ff4f97e9ce4189678c91efb26361081c371dbdbd9af"} Mar 10 16:20:03 crc kubenswrapper[4749]: I0310 16:20:03.244997 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh8d2" event={"ID":"c946100b-1bf3-4abe-b0ce-af1638f2bba0","Type":"ContainerStarted","Data":"05119f5bc1b661f4482fd93397d89e51e007367fb00609cd722b83981fe6fd9b"} Mar 10 16:20:04 crc kubenswrapper[4749]: I0310 16:20:04.259088 4749 generic.go:334] "Generic (PLEG): container finished" podID="c946100b-1bf3-4abe-b0ce-af1638f2bba0" containerID="05119f5bc1b661f4482fd93397d89e51e007367fb00609cd722b83981fe6fd9b" exitCode=0 Mar 10 16:20:04 crc kubenswrapper[4749]: I0310 16:20:04.259221 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh8d2" event={"ID":"c946100b-1bf3-4abe-b0ce-af1638f2bba0","Type":"ContainerDied","Data":"05119f5bc1b661f4482fd93397d89e51e007367fb00609cd722b83981fe6fd9b"} Mar 10 16:20:04 crc kubenswrapper[4749]: I0310 16:20:04.541179 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552660-r9m2m" Mar 10 16:20:04 crc kubenswrapper[4749]: I0310 16:20:04.702044 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jr52\" (UniqueName: \"kubernetes.io/projected/f563ee69-e442-4df7-b8f5-e0b58d48e00e-kube-api-access-5jr52\") pod \"f563ee69-e442-4df7-b8f5-e0b58d48e00e\" (UID: \"f563ee69-e442-4df7-b8f5-e0b58d48e00e\") " Mar 10 16:20:04 crc kubenswrapper[4749]: I0310 16:20:04.708928 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f563ee69-e442-4df7-b8f5-e0b58d48e00e-kube-api-access-5jr52" (OuterVolumeSpecName: "kube-api-access-5jr52") pod "f563ee69-e442-4df7-b8f5-e0b58d48e00e" (UID: "f563ee69-e442-4df7-b8f5-e0b58d48e00e"). InnerVolumeSpecName "kube-api-access-5jr52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:20:04 crc kubenswrapper[4749]: I0310 16:20:04.803469 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jr52\" (UniqueName: \"kubernetes.io/projected/f563ee69-e442-4df7-b8f5-e0b58d48e00e-kube-api-access-5jr52\") on node \"crc\" DevicePath \"\"" Mar 10 16:20:05 crc kubenswrapper[4749]: I0310 16:20:05.269266 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552660-r9m2m" event={"ID":"f563ee69-e442-4df7-b8f5-e0b58d48e00e","Type":"ContainerDied","Data":"487511b6b3b1b3f99ca33102751bbcd90f8505ca933b0e76b294a17f906f7b9d"} Mar 10 16:20:05 crc kubenswrapper[4749]: I0310 16:20:05.269305 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="487511b6b3b1b3f99ca33102751bbcd90f8505ca933b0e76b294a17f906f7b9d" Mar 10 16:20:05 crc kubenswrapper[4749]: I0310 16:20:05.269342 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552660-r9m2m" Mar 10 16:20:05 crc kubenswrapper[4749]: I0310 16:20:05.641831 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552654-cpwdb"] Mar 10 16:20:05 crc kubenswrapper[4749]: I0310 16:20:05.645599 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552654-cpwdb"] Mar 10 16:20:06 crc kubenswrapper[4749]: I0310 16:20:06.282750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh8d2" event={"ID":"c946100b-1bf3-4abe-b0ce-af1638f2bba0","Type":"ContainerStarted","Data":"e505b4c31917ca3906ab47a8c9d390dcb63dd46bb418f03cdc5f59b3b362894e"} Mar 10 16:20:06 crc kubenswrapper[4749]: I0310 16:20:06.307127 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vh8d2" podStartSLOduration=2.95563579 podStartE2EDuration="7.307105732s" podCreationTimestamp="2026-03-10 16:19:59 +0000 UTC" firstStartedPulling="2026-03-10 16:20:01.221719112 +0000 UTC m=+1898.343584799" lastFinishedPulling="2026-03-10 16:20:05.573189054 +0000 UTC m=+1902.695054741" observedRunningTime="2026-03-10 16:20:06.306044423 +0000 UTC m=+1903.427910130" watchObservedRunningTime="2026-03-10 16:20:06.307105732 +0000 UTC m=+1903.428971429" Mar 10 16:20:07 crc kubenswrapper[4749]: I0310 16:20:07.048602 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:20:07 crc kubenswrapper[4749]: I0310 16:20:07.049087 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:20:07 crc kubenswrapper[4749]: I0310 16:20:07.095239 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:20:07 crc kubenswrapper[4749]: I0310 
16:20:07.241226 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:20:07 crc kubenswrapper[4749]: I0310 16:20:07.241294 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:20:07 crc kubenswrapper[4749]: I0310 16:20:07.297894 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:20:07 crc kubenswrapper[4749]: I0310 16:20:07.336188 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:20:07 crc kubenswrapper[4749]: I0310 16:20:07.345781 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:20:07 crc kubenswrapper[4749]: I0310 16:20:07.616498 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7445876f-74e1-4179-a304-c71e79e6d5d8" path="/var/lib/kubelet/pods/7445876f-74e1-4179-a304-c71e79e6d5d8/volumes" Mar 10 16:20:09 crc kubenswrapper[4749]: I0310 16:20:09.512271 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cdmbq"] Mar 10 16:20:09 crc kubenswrapper[4749]: I0310 16:20:09.679761 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:20:09 crc kubenswrapper[4749]: I0310 16:20:09.679860 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:20:10 crc kubenswrapper[4749]: I0310 16:20:10.113300 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wqtn7"] Mar 10 16:20:10 crc kubenswrapper[4749]: I0310 16:20:10.113576 4749 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/certified-operators-wqtn7" podUID="de5a75e0-c933-4f2c-829f-a44ae64dc482" containerName="registry-server" containerID="cri-o://7adac0fab0bd90470ae686bb7d6733c7b2f39815eb752b07745b0ef4ca789ced" gracePeriod=2 Mar 10 16:20:10 crc kubenswrapper[4749]: I0310 16:20:10.308782 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cdmbq" podUID="efb7b6e3-63ab-49c7-96cf-a4d45888fc28" containerName="registry-server" containerID="cri-o://5836a17f46bce9647224ce15b7b28194e209fd2034ef76c8fd236db6c32194d0" gracePeriod=2 Mar 10 16:20:10 crc kubenswrapper[4749]: I0310 16:20:10.606797 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:20:10 crc kubenswrapper[4749]: E0310 16:20:10.607078 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:20:10 crc kubenswrapper[4749]: I0310 16:20:10.755154 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vh8d2" podUID="c946100b-1bf3-4abe-b0ce-af1638f2bba0" containerName="registry-server" probeResult="failure" output=< Mar 10 16:20:10 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 10 16:20:10 crc kubenswrapper[4749]: > Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.306396 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.323868 4749 generic.go:334] "Generic (PLEG): container finished" podID="de5a75e0-c933-4f2c-829f-a44ae64dc482" containerID="7adac0fab0bd90470ae686bb7d6733c7b2f39815eb752b07745b0ef4ca789ced" exitCode=0 Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.323949 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqtn7" event={"ID":"de5a75e0-c933-4f2c-829f-a44ae64dc482","Type":"ContainerDied","Data":"7adac0fab0bd90470ae686bb7d6733c7b2f39815eb752b07745b0ef4ca789ced"} Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.327924 4749 generic.go:334] "Generic (PLEG): container finished" podID="efb7b6e3-63ab-49c7-96cf-a4d45888fc28" containerID="5836a17f46bce9647224ce15b7b28194e209fd2034ef76c8fd236db6c32194d0" exitCode=0 Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.327966 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdmbq" event={"ID":"efb7b6e3-63ab-49c7-96cf-a4d45888fc28","Type":"ContainerDied","Data":"5836a17f46bce9647224ce15b7b28194e209fd2034ef76c8fd236db6c32194d0"} Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.328002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdmbq" event={"ID":"efb7b6e3-63ab-49c7-96cf-a4d45888fc28","Type":"ContainerDied","Data":"c60eaddc58bb15150eea474639daf2d24deaef654ec5701dbd9798f9c4549c95"} Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.328019 4749 scope.go:117] "RemoveContainer" containerID="5836a17f46bce9647224ce15b7b28194e209fd2034ef76c8fd236db6c32194d0" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.328159 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cdmbq" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.364310 4749 scope.go:117] "RemoveContainer" containerID="99ddb120d44b9d1c819d71bd40e5d46462c18398cb60fdd2aa78e3845e5807c0" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.393343 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-catalog-content\") pod \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\" (UID: \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\") " Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.393440 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht4dd\" (UniqueName: \"kubernetes.io/projected/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-kube-api-access-ht4dd\") pod \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\" (UID: \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\") " Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.393528 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-utilities\") pod \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\" (UID: \"efb7b6e3-63ab-49c7-96cf-a4d45888fc28\") " Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.394652 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-utilities" (OuterVolumeSpecName: "utilities") pod "efb7b6e3-63ab-49c7-96cf-a4d45888fc28" (UID: "efb7b6e3-63ab-49c7-96cf-a4d45888fc28"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.398664 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-kube-api-access-ht4dd" (OuterVolumeSpecName: "kube-api-access-ht4dd") pod "efb7b6e3-63ab-49c7-96cf-a4d45888fc28" (UID: "efb7b6e3-63ab-49c7-96cf-a4d45888fc28"). InnerVolumeSpecName "kube-api-access-ht4dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.409130 4749 scope.go:117] "RemoveContainer" containerID="22665903e7f35258ba446d9aad4b6e0eb250492d6f6dd2d1e5d8e7794e44e16b" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.438868 4749 scope.go:117] "RemoveContainer" containerID="5836a17f46bce9647224ce15b7b28194e209fd2034ef76c8fd236db6c32194d0" Mar 10 16:20:11 crc kubenswrapper[4749]: E0310 16:20:11.439322 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5836a17f46bce9647224ce15b7b28194e209fd2034ef76c8fd236db6c32194d0\": container with ID starting with 5836a17f46bce9647224ce15b7b28194e209fd2034ef76c8fd236db6c32194d0 not found: ID does not exist" containerID="5836a17f46bce9647224ce15b7b28194e209fd2034ef76c8fd236db6c32194d0" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.439393 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5836a17f46bce9647224ce15b7b28194e209fd2034ef76c8fd236db6c32194d0"} err="failed to get container status \"5836a17f46bce9647224ce15b7b28194e209fd2034ef76c8fd236db6c32194d0\": rpc error: code = NotFound desc = could not find container \"5836a17f46bce9647224ce15b7b28194e209fd2034ef76c8fd236db6c32194d0\": container with ID starting with 5836a17f46bce9647224ce15b7b28194e209fd2034ef76c8fd236db6c32194d0 not found: ID does not exist" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.439422 
4749 scope.go:117] "RemoveContainer" containerID="99ddb120d44b9d1c819d71bd40e5d46462c18398cb60fdd2aa78e3845e5807c0" Mar 10 16:20:11 crc kubenswrapper[4749]: E0310 16:20:11.441302 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99ddb120d44b9d1c819d71bd40e5d46462c18398cb60fdd2aa78e3845e5807c0\": container with ID starting with 99ddb120d44b9d1c819d71bd40e5d46462c18398cb60fdd2aa78e3845e5807c0 not found: ID does not exist" containerID="99ddb120d44b9d1c819d71bd40e5d46462c18398cb60fdd2aa78e3845e5807c0" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.441358 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99ddb120d44b9d1c819d71bd40e5d46462c18398cb60fdd2aa78e3845e5807c0"} err="failed to get container status \"99ddb120d44b9d1c819d71bd40e5d46462c18398cb60fdd2aa78e3845e5807c0\": rpc error: code = NotFound desc = could not find container \"99ddb120d44b9d1c819d71bd40e5d46462c18398cb60fdd2aa78e3845e5807c0\": container with ID starting with 99ddb120d44b9d1c819d71bd40e5d46462c18398cb60fdd2aa78e3845e5807c0 not found: ID does not exist" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.441411 4749 scope.go:117] "RemoveContainer" containerID="22665903e7f35258ba446d9aad4b6e0eb250492d6f6dd2d1e5d8e7794e44e16b" Mar 10 16:20:11 crc kubenswrapper[4749]: E0310 16:20:11.442929 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22665903e7f35258ba446d9aad4b6e0eb250492d6f6dd2d1e5d8e7794e44e16b\": container with ID starting with 22665903e7f35258ba446d9aad4b6e0eb250492d6f6dd2d1e5d8e7794e44e16b not found: ID does not exist" containerID="22665903e7f35258ba446d9aad4b6e0eb250492d6f6dd2d1e5d8e7794e44e16b" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.442961 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"22665903e7f35258ba446d9aad4b6e0eb250492d6f6dd2d1e5d8e7794e44e16b"} err="failed to get container status \"22665903e7f35258ba446d9aad4b6e0eb250492d6f6dd2d1e5d8e7794e44e16b\": rpc error: code = NotFound desc = could not find container \"22665903e7f35258ba446d9aad4b6e0eb250492d6f6dd2d1e5d8e7794e44e16b\": container with ID starting with 22665903e7f35258ba446d9aad4b6e0eb250492d6f6dd2d1e5d8e7794e44e16b not found: ID does not exist" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.452622 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efb7b6e3-63ab-49c7-96cf-a4d45888fc28" (UID: "efb7b6e3-63ab-49c7-96cf-a4d45888fc28"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.494745 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.494776 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht4dd\" (UniqueName: \"kubernetes.io/projected/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-kube-api-access-ht4dd\") on node \"crc\" DevicePath \"\"" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.494787 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb7b6e3-63ab-49c7-96cf-a4d45888fc28-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.666829 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cdmbq"] Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.673307 4749 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/community-operators-cdmbq"] Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.716074 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.798997 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de5a75e0-c933-4f2c-829f-a44ae64dc482-catalog-content\") pod \"de5a75e0-c933-4f2c-829f-a44ae64dc482\" (UID: \"de5a75e0-c933-4f2c-829f-a44ae64dc482\") " Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.799105 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87tg8\" (UniqueName: \"kubernetes.io/projected/de5a75e0-c933-4f2c-829f-a44ae64dc482-kube-api-access-87tg8\") pod \"de5a75e0-c933-4f2c-829f-a44ae64dc482\" (UID: \"de5a75e0-c933-4f2c-829f-a44ae64dc482\") " Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.799195 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de5a75e0-c933-4f2c-829f-a44ae64dc482-utilities\") pod \"de5a75e0-c933-4f2c-829f-a44ae64dc482\" (UID: \"de5a75e0-c933-4f2c-829f-a44ae64dc482\") " Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.800299 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de5a75e0-c933-4f2c-829f-a44ae64dc482-utilities" (OuterVolumeSpecName: "utilities") pod "de5a75e0-c933-4f2c-829f-a44ae64dc482" (UID: "de5a75e0-c933-4f2c-829f-a44ae64dc482"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.804411 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5a75e0-c933-4f2c-829f-a44ae64dc482-kube-api-access-87tg8" (OuterVolumeSpecName: "kube-api-access-87tg8") pod "de5a75e0-c933-4f2c-829f-a44ae64dc482" (UID: "de5a75e0-c933-4f2c-829f-a44ae64dc482"). InnerVolumeSpecName "kube-api-access-87tg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.859845 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de5a75e0-c933-4f2c-829f-a44ae64dc482-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de5a75e0-c933-4f2c-829f-a44ae64dc482" (UID: "de5a75e0-c933-4f2c-829f-a44ae64dc482"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.900599 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de5a75e0-c933-4f2c-829f-a44ae64dc482-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.900636 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87tg8\" (UniqueName: \"kubernetes.io/projected/de5a75e0-c933-4f2c-829f-a44ae64dc482-kube-api-access-87tg8\") on node \"crc\" DevicePath \"\"" Mar 10 16:20:11 crc kubenswrapper[4749]: I0310 16:20:11.900693 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de5a75e0-c933-4f2c-829f-a44ae64dc482-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:20:12 crc kubenswrapper[4749]: I0310 16:20:12.337126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqtn7" 
event={"ID":"de5a75e0-c933-4f2c-829f-a44ae64dc482","Type":"ContainerDied","Data":"3990e61e9a5e7938f891eeeda2a463147ebf9f1b687ce78424d6bd970aa7faa0"} Mar 10 16:20:12 crc kubenswrapper[4749]: I0310 16:20:12.337192 4749 scope.go:117] "RemoveContainer" containerID="7adac0fab0bd90470ae686bb7d6733c7b2f39815eb752b07745b0ef4ca789ced" Mar 10 16:20:12 crc kubenswrapper[4749]: I0310 16:20:12.337214 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wqtn7" Mar 10 16:20:12 crc kubenswrapper[4749]: I0310 16:20:12.356850 4749 scope.go:117] "RemoveContainer" containerID="cf54d3a738626ba017ac2388f71c4e26a7df855f02817f6419a9a2a1b124198e" Mar 10 16:20:12 crc kubenswrapper[4749]: I0310 16:20:12.382598 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wqtn7"] Mar 10 16:20:12 crc kubenswrapper[4749]: I0310 16:20:12.389940 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wqtn7"] Mar 10 16:20:12 crc kubenswrapper[4749]: I0310 16:20:12.393554 4749 scope.go:117] "RemoveContainer" containerID="4ca19c7e48e36895f316b452339617583379664ca9e4be69756844566d0f9998" Mar 10 16:20:13 crc kubenswrapper[4749]: I0310 16:20:13.620680 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de5a75e0-c933-4f2c-829f-a44ae64dc482" path="/var/lib/kubelet/pods/de5a75e0-c933-4f2c-829f-a44ae64dc482/volumes" Mar 10 16:20:13 crc kubenswrapper[4749]: I0310 16:20:13.622065 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb7b6e3-63ab-49c7-96cf-a4d45888fc28" path="/var/lib/kubelet/pods/efb7b6e3-63ab-49c7-96cf-a4d45888fc28/volumes" Mar 10 16:20:19 crc kubenswrapper[4749]: I0310 16:20:19.751109 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:20:19 crc kubenswrapper[4749]: I0310 16:20:19.806878 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:20:19 crc kubenswrapper[4749]: I0310 16:20:19.994810 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vh8d2"] Mar 10 16:20:21 crc kubenswrapper[4749]: I0310 16:20:21.416784 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vh8d2" podUID="c946100b-1bf3-4abe-b0ce-af1638f2bba0" containerName="registry-server" containerID="cri-o://e505b4c31917ca3906ab47a8c9d390dcb63dd46bb418f03cdc5f59b3b362894e" gracePeriod=2 Mar 10 16:20:21 crc kubenswrapper[4749]: I0310 16:20:21.607696 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:20:21 crc kubenswrapper[4749]: E0310 16:20:21.608184 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:20:21 crc kubenswrapper[4749]: I0310 16:20:21.855337 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:20:21 crc kubenswrapper[4749]: I0310 16:20:21.945593 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zvl7\" (UniqueName: \"kubernetes.io/projected/c946100b-1bf3-4abe-b0ce-af1638f2bba0-kube-api-access-8zvl7\") pod \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\" (UID: \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\") " Mar 10 16:20:21 crc kubenswrapper[4749]: I0310 16:20:21.945908 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c946100b-1bf3-4abe-b0ce-af1638f2bba0-utilities\") pod \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\" (UID: \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\") " Mar 10 16:20:21 crc kubenswrapper[4749]: I0310 16:20:21.945982 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c946100b-1bf3-4abe-b0ce-af1638f2bba0-catalog-content\") pod \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\" (UID: \"c946100b-1bf3-4abe-b0ce-af1638f2bba0\") " Mar 10 16:20:21 crc kubenswrapper[4749]: I0310 16:20:21.946740 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c946100b-1bf3-4abe-b0ce-af1638f2bba0-utilities" (OuterVolumeSpecName: "utilities") pod "c946100b-1bf3-4abe-b0ce-af1638f2bba0" (UID: "c946100b-1bf3-4abe-b0ce-af1638f2bba0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:20:21 crc kubenswrapper[4749]: I0310 16:20:21.946991 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c946100b-1bf3-4abe-b0ce-af1638f2bba0-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:20:21 crc kubenswrapper[4749]: I0310 16:20:21.950846 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c946100b-1bf3-4abe-b0ce-af1638f2bba0-kube-api-access-8zvl7" (OuterVolumeSpecName: "kube-api-access-8zvl7") pod "c946100b-1bf3-4abe-b0ce-af1638f2bba0" (UID: "c946100b-1bf3-4abe-b0ce-af1638f2bba0"). InnerVolumeSpecName "kube-api-access-8zvl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.049215 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zvl7\" (UniqueName: \"kubernetes.io/projected/c946100b-1bf3-4abe-b0ce-af1638f2bba0-kube-api-access-8zvl7\") on node \"crc\" DevicePath \"\"" Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.105838 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c946100b-1bf3-4abe-b0ce-af1638f2bba0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c946100b-1bf3-4abe-b0ce-af1638f2bba0" (UID: "c946100b-1bf3-4abe-b0ce-af1638f2bba0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.149958 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c946100b-1bf3-4abe-b0ce-af1638f2bba0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.430459 4749 generic.go:334] "Generic (PLEG): container finished" podID="c946100b-1bf3-4abe-b0ce-af1638f2bba0" containerID="e505b4c31917ca3906ab47a8c9d390dcb63dd46bb418f03cdc5f59b3b362894e" exitCode=0 Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.430520 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh8d2" event={"ID":"c946100b-1bf3-4abe-b0ce-af1638f2bba0","Type":"ContainerDied","Data":"e505b4c31917ca3906ab47a8c9d390dcb63dd46bb418f03cdc5f59b3b362894e"} Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.430551 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vh8d2" Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.430565 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh8d2" event={"ID":"c946100b-1bf3-4abe-b0ce-af1638f2bba0","Type":"ContainerDied","Data":"132b015a41edc4b59ff84db12e8c805ac543ae3c153763c5ef2533a7b79046e0"} Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.430602 4749 scope.go:117] "RemoveContainer" containerID="e505b4c31917ca3906ab47a8c9d390dcb63dd46bb418f03cdc5f59b3b362894e" Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.481156 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vh8d2"] Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.481554 4749 scope.go:117] "RemoveContainer" containerID="05119f5bc1b661f4482fd93397d89e51e007367fb00609cd722b83981fe6fd9b" Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.485592 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vh8d2"] Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.514604 4749 scope.go:117] "RemoveContainer" containerID="4bc99f0118f09be29be0727921eba112de1ed184be73a3bf0da10f0bf5ffb078" Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.543308 4749 scope.go:117] "RemoveContainer" containerID="e505b4c31917ca3906ab47a8c9d390dcb63dd46bb418f03cdc5f59b3b362894e" Mar 10 16:20:22 crc kubenswrapper[4749]: E0310 16:20:22.543954 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e505b4c31917ca3906ab47a8c9d390dcb63dd46bb418f03cdc5f59b3b362894e\": container with ID starting with e505b4c31917ca3906ab47a8c9d390dcb63dd46bb418f03cdc5f59b3b362894e not found: ID does not exist" containerID="e505b4c31917ca3906ab47a8c9d390dcb63dd46bb418f03cdc5f59b3b362894e" Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.543990 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e505b4c31917ca3906ab47a8c9d390dcb63dd46bb418f03cdc5f59b3b362894e"} err="failed to get container status \"e505b4c31917ca3906ab47a8c9d390dcb63dd46bb418f03cdc5f59b3b362894e\": rpc error: code = NotFound desc = could not find container \"e505b4c31917ca3906ab47a8c9d390dcb63dd46bb418f03cdc5f59b3b362894e\": container with ID starting with e505b4c31917ca3906ab47a8c9d390dcb63dd46bb418f03cdc5f59b3b362894e not found: ID does not exist" Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.544019 4749 scope.go:117] "RemoveContainer" containerID="05119f5bc1b661f4482fd93397d89e51e007367fb00609cd722b83981fe6fd9b" Mar 10 16:20:22 crc kubenswrapper[4749]: E0310 16:20:22.544822 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05119f5bc1b661f4482fd93397d89e51e007367fb00609cd722b83981fe6fd9b\": container with ID starting with 05119f5bc1b661f4482fd93397d89e51e007367fb00609cd722b83981fe6fd9b not found: ID does not exist" containerID="05119f5bc1b661f4482fd93397d89e51e007367fb00609cd722b83981fe6fd9b" Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.544870 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05119f5bc1b661f4482fd93397d89e51e007367fb00609cd722b83981fe6fd9b"} err="failed to get container status \"05119f5bc1b661f4482fd93397d89e51e007367fb00609cd722b83981fe6fd9b\": rpc error: code = NotFound desc = could not find container \"05119f5bc1b661f4482fd93397d89e51e007367fb00609cd722b83981fe6fd9b\": container with ID starting with 05119f5bc1b661f4482fd93397d89e51e007367fb00609cd722b83981fe6fd9b not found: ID does not exist" Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.544905 4749 scope.go:117] "RemoveContainer" containerID="4bc99f0118f09be29be0727921eba112de1ed184be73a3bf0da10f0bf5ffb078" Mar 10 16:20:22 crc kubenswrapper[4749]: E0310 
16:20:22.545480 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bc99f0118f09be29be0727921eba112de1ed184be73a3bf0da10f0bf5ffb078\": container with ID starting with 4bc99f0118f09be29be0727921eba112de1ed184be73a3bf0da10f0bf5ffb078 not found: ID does not exist" containerID="4bc99f0118f09be29be0727921eba112de1ed184be73a3bf0da10f0bf5ffb078" Mar 10 16:20:22 crc kubenswrapper[4749]: I0310 16:20:22.545513 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bc99f0118f09be29be0727921eba112de1ed184be73a3bf0da10f0bf5ffb078"} err="failed to get container status \"4bc99f0118f09be29be0727921eba112de1ed184be73a3bf0da10f0bf5ffb078\": rpc error: code = NotFound desc = could not find container \"4bc99f0118f09be29be0727921eba112de1ed184be73a3bf0da10f0bf5ffb078\": container with ID starting with 4bc99f0118f09be29be0727921eba112de1ed184be73a3bf0da10f0bf5ffb078 not found: ID does not exist" Mar 10 16:20:23 crc kubenswrapper[4749]: I0310 16:20:23.625180 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c946100b-1bf3-4abe-b0ce-af1638f2bba0" path="/var/lib/kubelet/pods/c946100b-1bf3-4abe-b0ce-af1638f2bba0/volumes" Mar 10 16:20:23 crc kubenswrapper[4749]: I0310 16:20:23.991887 4749 scope.go:117] "RemoveContainer" containerID="6158da3960733661287c4fa53fec91bc77be2aecd94c8547723954fe2ad29c49" Mar 10 16:20:35 crc kubenswrapper[4749]: I0310 16:20:35.607024 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:20:35 crc kubenswrapper[4749]: E0310 16:20:35.608164 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:20:46 crc kubenswrapper[4749]: I0310 16:20:46.606916 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:20:46 crc kubenswrapper[4749]: E0310 16:20:46.607857 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:21:01 crc kubenswrapper[4749]: I0310 16:21:01.606951 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:21:01 crc kubenswrapper[4749]: E0310 16:21:01.607673 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:21:13 crc kubenswrapper[4749]: I0310 16:21:13.613499 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:21:13 crc kubenswrapper[4749]: E0310 16:21:13.614699 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:21:28 crc kubenswrapper[4749]: I0310 16:21:28.606369 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:21:28 crc kubenswrapper[4749]: E0310 16:21:28.607434 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:21:43 crc kubenswrapper[4749]: I0310 16:21:43.614330 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:21:43 crc kubenswrapper[4749]: E0310 16:21:43.628692 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:21:56 crc kubenswrapper[4749]: I0310 16:21:56.607002 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:21:56 crc kubenswrapper[4749]: E0310 16:21:56.607732 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.158711 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552662-6kqd2"] Mar 10 16:22:00 crc kubenswrapper[4749]: E0310 16:22:00.159833 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c946100b-1bf3-4abe-b0ce-af1638f2bba0" containerName="extract-utilities" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.159859 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c946100b-1bf3-4abe-b0ce-af1638f2bba0" containerName="extract-utilities" Mar 10 16:22:00 crc kubenswrapper[4749]: E0310 16:22:00.159884 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c946100b-1bf3-4abe-b0ce-af1638f2bba0" containerName="extract-content" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.159897 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c946100b-1bf3-4abe-b0ce-af1638f2bba0" containerName="extract-content" Mar 10 16:22:00 crc kubenswrapper[4749]: E0310 16:22:00.159935 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb7b6e3-63ab-49c7-96cf-a4d45888fc28" containerName="extract-content" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.159949 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb7b6e3-63ab-49c7-96cf-a4d45888fc28" containerName="extract-content" Mar 10 16:22:00 crc kubenswrapper[4749]: E0310 16:22:00.159970 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c946100b-1bf3-4abe-b0ce-af1638f2bba0" containerName="registry-server" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.159984 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c946100b-1bf3-4abe-b0ce-af1638f2bba0" 
containerName="registry-server" Mar 10 16:22:00 crc kubenswrapper[4749]: E0310 16:22:00.160001 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb7b6e3-63ab-49c7-96cf-a4d45888fc28" containerName="registry-server" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.160014 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb7b6e3-63ab-49c7-96cf-a4d45888fc28" containerName="registry-server" Mar 10 16:22:00 crc kubenswrapper[4749]: E0310 16:22:00.160036 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5a75e0-c933-4f2c-829f-a44ae64dc482" containerName="registry-server" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.160050 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5a75e0-c933-4f2c-829f-a44ae64dc482" containerName="registry-server" Mar 10 16:22:00 crc kubenswrapper[4749]: E0310 16:22:00.160077 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5a75e0-c933-4f2c-829f-a44ae64dc482" containerName="extract-content" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.160090 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5a75e0-c933-4f2c-829f-a44ae64dc482" containerName="extract-content" Mar 10 16:22:00 crc kubenswrapper[4749]: E0310 16:22:00.160113 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5a75e0-c933-4f2c-829f-a44ae64dc482" containerName="extract-utilities" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.160126 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5a75e0-c933-4f2c-829f-a44ae64dc482" containerName="extract-utilities" Mar 10 16:22:00 crc kubenswrapper[4749]: E0310 16:22:00.160152 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f563ee69-e442-4df7-b8f5-e0b58d48e00e" containerName="oc" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.160166 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f563ee69-e442-4df7-b8f5-e0b58d48e00e" containerName="oc" Mar 
10 16:22:00 crc kubenswrapper[4749]: E0310 16:22:00.160190 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb7b6e3-63ab-49c7-96cf-a4d45888fc28" containerName="extract-utilities" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.160203 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb7b6e3-63ab-49c7-96cf-a4d45888fc28" containerName="extract-utilities" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.160598 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c946100b-1bf3-4abe-b0ce-af1638f2bba0" containerName="registry-server" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.160634 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb7b6e3-63ab-49c7-96cf-a4d45888fc28" containerName="registry-server" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.160666 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f563ee69-e442-4df7-b8f5-e0b58d48e00e" containerName="oc" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.160709 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="de5a75e0-c933-4f2c-829f-a44ae64dc482" containerName="registry-server" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.161715 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552662-6kqd2" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.164224 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.164671 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.166166 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.174849 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552662-6kqd2"] Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.340329 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w68t5\" (UniqueName: \"kubernetes.io/projected/c56f3c78-8506-41fb-9385-33676fd1d66d-kube-api-access-w68t5\") pod \"auto-csr-approver-29552662-6kqd2\" (UID: \"c56f3c78-8506-41fb-9385-33676fd1d66d\") " pod="openshift-infra/auto-csr-approver-29552662-6kqd2" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.441708 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w68t5\" (UniqueName: \"kubernetes.io/projected/c56f3c78-8506-41fb-9385-33676fd1d66d-kube-api-access-w68t5\") pod \"auto-csr-approver-29552662-6kqd2\" (UID: \"c56f3c78-8506-41fb-9385-33676fd1d66d\") " pod="openshift-infra/auto-csr-approver-29552662-6kqd2" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.464515 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w68t5\" (UniqueName: \"kubernetes.io/projected/c56f3c78-8506-41fb-9385-33676fd1d66d-kube-api-access-w68t5\") pod \"auto-csr-approver-29552662-6kqd2\" (UID: \"c56f3c78-8506-41fb-9385-33676fd1d66d\") " 
pod="openshift-infra/auto-csr-approver-29552662-6kqd2" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.496020 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552662-6kqd2" Mar 10 16:22:00 crc kubenswrapper[4749]: I0310 16:22:00.970790 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552662-6kqd2"] Mar 10 16:22:01 crc kubenswrapper[4749]: I0310 16:22:01.273912 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552662-6kqd2" event={"ID":"c56f3c78-8506-41fb-9385-33676fd1d66d","Type":"ContainerStarted","Data":"9e724b95a7acb2232ab0d6731b4f1df6815cc297c569457482f9cc66ec2ff47f"} Mar 10 16:22:03 crc kubenswrapper[4749]: I0310 16:22:03.294092 4749 generic.go:334] "Generic (PLEG): container finished" podID="c56f3c78-8506-41fb-9385-33676fd1d66d" containerID="4102c36ba098f77a65a6ad3eb3d29ff1663705100d411957f4483ac9d392364c" exitCode=0 Mar 10 16:22:03 crc kubenswrapper[4749]: I0310 16:22:03.294188 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552662-6kqd2" event={"ID":"c56f3c78-8506-41fb-9385-33676fd1d66d","Type":"ContainerDied","Data":"4102c36ba098f77a65a6ad3eb3d29ff1663705100d411957f4483ac9d392364c"} Mar 10 16:22:04 crc kubenswrapper[4749]: I0310 16:22:04.605196 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552662-6kqd2" Mar 10 16:22:04 crc kubenswrapper[4749]: I0310 16:22:04.715984 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w68t5\" (UniqueName: \"kubernetes.io/projected/c56f3c78-8506-41fb-9385-33676fd1d66d-kube-api-access-w68t5\") pod \"c56f3c78-8506-41fb-9385-33676fd1d66d\" (UID: \"c56f3c78-8506-41fb-9385-33676fd1d66d\") " Mar 10 16:22:04 crc kubenswrapper[4749]: I0310 16:22:04.722769 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c56f3c78-8506-41fb-9385-33676fd1d66d-kube-api-access-w68t5" (OuterVolumeSpecName: "kube-api-access-w68t5") pod "c56f3c78-8506-41fb-9385-33676fd1d66d" (UID: "c56f3c78-8506-41fb-9385-33676fd1d66d"). InnerVolumeSpecName "kube-api-access-w68t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:22:04 crc kubenswrapper[4749]: I0310 16:22:04.817901 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w68t5\" (UniqueName: \"kubernetes.io/projected/c56f3c78-8506-41fb-9385-33676fd1d66d-kube-api-access-w68t5\") on node \"crc\" DevicePath \"\"" Mar 10 16:22:05 crc kubenswrapper[4749]: I0310 16:22:05.308545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552662-6kqd2" event={"ID":"c56f3c78-8506-41fb-9385-33676fd1d66d","Type":"ContainerDied","Data":"9e724b95a7acb2232ab0d6731b4f1df6815cc297c569457482f9cc66ec2ff47f"} Mar 10 16:22:05 crc kubenswrapper[4749]: I0310 16:22:05.308587 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e724b95a7acb2232ab0d6731b4f1df6815cc297c569457482f9cc66ec2ff47f" Mar 10 16:22:05 crc kubenswrapper[4749]: I0310 16:22:05.308655 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552662-6kqd2" Mar 10 16:22:05 crc kubenswrapper[4749]: I0310 16:22:05.692153 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552656-rb8kq"] Mar 10 16:22:05 crc kubenswrapper[4749]: I0310 16:22:05.699344 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552656-rb8kq"] Mar 10 16:22:07 crc kubenswrapper[4749]: I0310 16:22:07.618104 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52f53359-bd54-4ae2-ac09-adf68d7a9eb8" path="/var/lib/kubelet/pods/52f53359-bd54-4ae2-ac09-adf68d7a9eb8/volumes" Mar 10 16:22:08 crc kubenswrapper[4749]: I0310 16:22:08.607172 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:22:08 crc kubenswrapper[4749]: E0310 16:22:08.607813 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:22:23 crc kubenswrapper[4749]: I0310 16:22:23.615178 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:22:23 crc kubenswrapper[4749]: E0310 16:22:23.616278 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:22:24 crc kubenswrapper[4749]: I0310 16:22:24.134847 4749 scope.go:117] "RemoveContainer" containerID="f502e5e7f13f66721039f9643ff5a797150bcbce4483e694ed6659e0d94b0f12" Mar 10 16:22:37 crc kubenswrapper[4749]: I0310 16:22:37.606982 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:22:37 crc kubenswrapper[4749]: E0310 16:22:37.607708 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:22:48 crc kubenswrapper[4749]: I0310 16:22:48.606124 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:22:48 crc kubenswrapper[4749]: E0310 16:22:48.608480 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:23:00 crc kubenswrapper[4749]: I0310 16:23:00.607844 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:23:00 crc kubenswrapper[4749]: I0310 16:23:00.815636 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"b7adaaac49c72dbb9804e4450407d6b2a442880a14ae61fa8bea5a508b2de8ea"} Mar 10 16:24:00 crc kubenswrapper[4749]: I0310 16:24:00.148035 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552664-hprkd"] Mar 10 16:24:00 crc kubenswrapper[4749]: E0310 16:24:00.148855 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56f3c78-8506-41fb-9385-33676fd1d66d" containerName="oc" Mar 10 16:24:00 crc kubenswrapper[4749]: I0310 16:24:00.148869 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56f3c78-8506-41fb-9385-33676fd1d66d" containerName="oc" Mar 10 16:24:00 crc kubenswrapper[4749]: I0310 16:24:00.149025 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56f3c78-8506-41fb-9385-33676fd1d66d" containerName="oc" Mar 10 16:24:00 crc kubenswrapper[4749]: I0310 16:24:00.149586 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552664-hprkd" Mar 10 16:24:00 crc kubenswrapper[4749]: I0310 16:24:00.151758 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:24:00 crc kubenswrapper[4749]: I0310 16:24:00.154085 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:24:00 crc kubenswrapper[4749]: I0310 16:24:00.154268 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:24:00 crc kubenswrapper[4749]: I0310 16:24:00.163214 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552664-hprkd"] Mar 10 16:24:00 crc kubenswrapper[4749]: I0310 16:24:00.291367 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd826\" (UniqueName: 
\"kubernetes.io/projected/34e83e07-cfae-4232-8fcb-945eca1c4425-kube-api-access-xd826\") pod \"auto-csr-approver-29552664-hprkd\" (UID: \"34e83e07-cfae-4232-8fcb-945eca1c4425\") " pod="openshift-infra/auto-csr-approver-29552664-hprkd" Mar 10 16:24:00 crc kubenswrapper[4749]: I0310 16:24:00.392612 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd826\" (UniqueName: \"kubernetes.io/projected/34e83e07-cfae-4232-8fcb-945eca1c4425-kube-api-access-xd826\") pod \"auto-csr-approver-29552664-hprkd\" (UID: \"34e83e07-cfae-4232-8fcb-945eca1c4425\") " pod="openshift-infra/auto-csr-approver-29552664-hprkd" Mar 10 16:24:00 crc kubenswrapper[4749]: I0310 16:24:00.429437 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd826\" (UniqueName: \"kubernetes.io/projected/34e83e07-cfae-4232-8fcb-945eca1c4425-kube-api-access-xd826\") pod \"auto-csr-approver-29552664-hprkd\" (UID: \"34e83e07-cfae-4232-8fcb-945eca1c4425\") " pod="openshift-infra/auto-csr-approver-29552664-hprkd" Mar 10 16:24:00 crc kubenswrapper[4749]: I0310 16:24:00.475556 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552664-hprkd" Mar 10 16:24:00 crc kubenswrapper[4749]: I0310 16:24:00.910224 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552664-hprkd"] Mar 10 16:24:01 crc kubenswrapper[4749]: I0310 16:24:01.306203 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552664-hprkd" event={"ID":"34e83e07-cfae-4232-8fcb-945eca1c4425","Type":"ContainerStarted","Data":"9f99e04668cb81a89baf1e01b42df189e268105500d6ed267efaa13fa75bfa14"} Mar 10 16:24:03 crc kubenswrapper[4749]: I0310 16:24:03.324259 4749 generic.go:334] "Generic (PLEG): container finished" podID="34e83e07-cfae-4232-8fcb-945eca1c4425" containerID="f5de8894560d11e1ad6786a4a95ca31c29d7a02cf4b613ae120a657fe64ab518" exitCode=0 Mar 10 16:24:03 crc kubenswrapper[4749]: I0310 16:24:03.324546 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552664-hprkd" event={"ID":"34e83e07-cfae-4232-8fcb-945eca1c4425","Type":"ContainerDied","Data":"f5de8894560d11e1ad6786a4a95ca31c29d7a02cf4b613ae120a657fe64ab518"} Mar 10 16:24:04 crc kubenswrapper[4749]: I0310 16:24:04.620963 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552664-hprkd" Mar 10 16:24:04 crc kubenswrapper[4749]: I0310 16:24:04.758538 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd826\" (UniqueName: \"kubernetes.io/projected/34e83e07-cfae-4232-8fcb-945eca1c4425-kube-api-access-xd826\") pod \"34e83e07-cfae-4232-8fcb-945eca1c4425\" (UID: \"34e83e07-cfae-4232-8fcb-945eca1c4425\") " Mar 10 16:24:04 crc kubenswrapper[4749]: I0310 16:24:04.769733 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e83e07-cfae-4232-8fcb-945eca1c4425-kube-api-access-xd826" (OuterVolumeSpecName: "kube-api-access-xd826") pod "34e83e07-cfae-4232-8fcb-945eca1c4425" (UID: "34e83e07-cfae-4232-8fcb-945eca1c4425"). InnerVolumeSpecName "kube-api-access-xd826". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:24:04 crc kubenswrapper[4749]: I0310 16:24:04.859877 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd826\" (UniqueName: \"kubernetes.io/projected/34e83e07-cfae-4232-8fcb-945eca1c4425-kube-api-access-xd826\") on node \"crc\" DevicePath \"\"" Mar 10 16:24:05 crc kubenswrapper[4749]: I0310 16:24:05.342521 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552664-hprkd" event={"ID":"34e83e07-cfae-4232-8fcb-945eca1c4425","Type":"ContainerDied","Data":"9f99e04668cb81a89baf1e01b42df189e268105500d6ed267efaa13fa75bfa14"} Mar 10 16:24:05 crc kubenswrapper[4749]: I0310 16:24:05.342826 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f99e04668cb81a89baf1e01b42df189e268105500d6ed267efaa13fa75bfa14" Mar 10 16:24:05 crc kubenswrapper[4749]: I0310 16:24:05.342540 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552664-hprkd" Mar 10 16:24:05 crc kubenswrapper[4749]: I0310 16:24:05.679627 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552658-zchgs"] Mar 10 16:24:05 crc kubenswrapper[4749]: I0310 16:24:05.686181 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552658-zchgs"] Mar 10 16:24:07 crc kubenswrapper[4749]: I0310 16:24:07.621220 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="802a5c46-b864-40c5-80d5-28dde91eaa3a" path="/var/lib/kubelet/pods/802a5c46-b864-40c5-80d5-28dde91eaa3a/volumes" Mar 10 16:24:24 crc kubenswrapper[4749]: I0310 16:24:24.225784 4749 scope.go:117] "RemoveContainer" containerID="cc530f1797b3e1b061394005ac4d9b380ef006602c5fbd08c91af5c4d9a6213d" Mar 10 16:25:20 crc kubenswrapper[4749]: I0310 16:25:20.980425 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:25:20 crc kubenswrapper[4749]: I0310 16:25:20.981094 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:25:50 crc kubenswrapper[4749]: I0310 16:25:50.980560 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:25:50 crc kubenswrapper[4749]: 
I0310 16:25:50.981251 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:26:00 crc kubenswrapper[4749]: I0310 16:26:00.137424 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552666-skz2g"] Mar 10 16:26:00 crc kubenswrapper[4749]: E0310 16:26:00.138287 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e83e07-cfae-4232-8fcb-945eca1c4425" containerName="oc" Mar 10 16:26:00 crc kubenswrapper[4749]: I0310 16:26:00.138305 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e83e07-cfae-4232-8fcb-945eca1c4425" containerName="oc" Mar 10 16:26:00 crc kubenswrapper[4749]: I0310 16:26:00.138485 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e83e07-cfae-4232-8fcb-945eca1c4425" containerName="oc" Mar 10 16:26:00 crc kubenswrapper[4749]: I0310 16:26:00.138933 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552666-skz2g" Mar 10 16:26:00 crc kubenswrapper[4749]: I0310 16:26:00.141536 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:26:00 crc kubenswrapper[4749]: I0310 16:26:00.141683 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:26:00 crc kubenswrapper[4749]: I0310 16:26:00.143892 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:26:00 crc kubenswrapper[4749]: I0310 16:26:00.148263 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552666-skz2g"] Mar 10 16:26:00 crc kubenswrapper[4749]: I0310 16:26:00.252629 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nph7d\" (UniqueName: \"kubernetes.io/projected/2c6d767a-41c8-4cbf-a205-e89b2fadd947-kube-api-access-nph7d\") pod \"auto-csr-approver-29552666-skz2g\" (UID: \"2c6d767a-41c8-4cbf-a205-e89b2fadd947\") " pod="openshift-infra/auto-csr-approver-29552666-skz2g" Mar 10 16:26:00 crc kubenswrapper[4749]: I0310 16:26:00.354002 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nph7d\" (UniqueName: \"kubernetes.io/projected/2c6d767a-41c8-4cbf-a205-e89b2fadd947-kube-api-access-nph7d\") pod \"auto-csr-approver-29552666-skz2g\" (UID: \"2c6d767a-41c8-4cbf-a205-e89b2fadd947\") " pod="openshift-infra/auto-csr-approver-29552666-skz2g" Mar 10 16:26:00 crc kubenswrapper[4749]: I0310 16:26:00.379219 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nph7d\" (UniqueName: \"kubernetes.io/projected/2c6d767a-41c8-4cbf-a205-e89b2fadd947-kube-api-access-nph7d\") pod \"auto-csr-approver-29552666-skz2g\" (UID: \"2c6d767a-41c8-4cbf-a205-e89b2fadd947\") " 
pod="openshift-infra/auto-csr-approver-29552666-skz2g" Mar 10 16:26:00 crc kubenswrapper[4749]: I0310 16:26:00.456760 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552666-skz2g" Mar 10 16:26:00 crc kubenswrapper[4749]: I0310 16:26:00.892460 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552666-skz2g"] Mar 10 16:26:00 crc kubenswrapper[4749]: I0310 16:26:00.902777 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:26:01 crc kubenswrapper[4749]: I0310 16:26:01.255017 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552666-skz2g" event={"ID":"2c6d767a-41c8-4cbf-a205-e89b2fadd947","Type":"ContainerStarted","Data":"b2f822f8e41d58d4fa5eab633e016e6987d33dea24d183912ef40e1a2b71f0fb"} Mar 10 16:26:02 crc kubenswrapper[4749]: I0310 16:26:02.261225 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552666-skz2g" event={"ID":"2c6d767a-41c8-4cbf-a205-e89b2fadd947","Type":"ContainerStarted","Data":"3d2f5b5a5f5c50d3e7437c02bab723d3c5cf8e4338e344594f00b1577c0b0121"} Mar 10 16:26:02 crc kubenswrapper[4749]: I0310 16:26:02.277157 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552666-skz2g" podStartSLOduration=1.245256473 podStartE2EDuration="2.27714274s" podCreationTimestamp="2026-03-10 16:26:00 +0000 UTC" firstStartedPulling="2026-03-10 16:26:00.902470859 +0000 UTC m=+2258.024336556" lastFinishedPulling="2026-03-10 16:26:01.934357146 +0000 UTC m=+2259.056222823" observedRunningTime="2026-03-10 16:26:02.273399327 +0000 UTC m=+2259.395265014" watchObservedRunningTime="2026-03-10 16:26:02.27714274 +0000 UTC m=+2259.399008427" Mar 10 16:26:03 crc kubenswrapper[4749]: I0310 16:26:03.269995 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="2c6d767a-41c8-4cbf-a205-e89b2fadd947" containerID="3d2f5b5a5f5c50d3e7437c02bab723d3c5cf8e4338e344594f00b1577c0b0121" exitCode=0 Mar 10 16:26:03 crc kubenswrapper[4749]: I0310 16:26:03.270125 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552666-skz2g" event={"ID":"2c6d767a-41c8-4cbf-a205-e89b2fadd947","Type":"ContainerDied","Data":"3d2f5b5a5f5c50d3e7437c02bab723d3c5cf8e4338e344594f00b1577c0b0121"} Mar 10 16:26:04 crc kubenswrapper[4749]: I0310 16:26:04.612624 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552666-skz2g" Mar 10 16:26:04 crc kubenswrapper[4749]: I0310 16:26:04.814339 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nph7d\" (UniqueName: \"kubernetes.io/projected/2c6d767a-41c8-4cbf-a205-e89b2fadd947-kube-api-access-nph7d\") pod \"2c6d767a-41c8-4cbf-a205-e89b2fadd947\" (UID: \"2c6d767a-41c8-4cbf-a205-e89b2fadd947\") " Mar 10 16:26:04 crc kubenswrapper[4749]: I0310 16:26:04.840748 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c6d767a-41c8-4cbf-a205-e89b2fadd947-kube-api-access-nph7d" (OuterVolumeSpecName: "kube-api-access-nph7d") pod "2c6d767a-41c8-4cbf-a205-e89b2fadd947" (UID: "2c6d767a-41c8-4cbf-a205-e89b2fadd947"). InnerVolumeSpecName "kube-api-access-nph7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:26:04 crc kubenswrapper[4749]: I0310 16:26:04.916421 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nph7d\" (UniqueName: \"kubernetes.io/projected/2c6d767a-41c8-4cbf-a205-e89b2fadd947-kube-api-access-nph7d\") on node \"crc\" DevicePath \"\"" Mar 10 16:26:05 crc kubenswrapper[4749]: I0310 16:26:05.283483 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552666-skz2g" event={"ID":"2c6d767a-41c8-4cbf-a205-e89b2fadd947","Type":"ContainerDied","Data":"b2f822f8e41d58d4fa5eab633e016e6987d33dea24d183912ef40e1a2b71f0fb"} Mar 10 16:26:05 crc kubenswrapper[4749]: I0310 16:26:05.283532 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2f822f8e41d58d4fa5eab633e016e6987d33dea24d183912ef40e1a2b71f0fb" Mar 10 16:26:05 crc kubenswrapper[4749]: I0310 16:26:05.283630 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552666-skz2g" Mar 10 16:26:05 crc kubenswrapper[4749]: I0310 16:26:05.353772 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552660-r9m2m"] Mar 10 16:26:05 crc kubenswrapper[4749]: I0310 16:26:05.365915 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552660-r9m2m"] Mar 10 16:26:05 crc kubenswrapper[4749]: I0310 16:26:05.614647 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f563ee69-e442-4df7-b8f5-e0b58d48e00e" path="/var/lib/kubelet/pods/f563ee69-e442-4df7-b8f5-e0b58d48e00e/volumes" Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.442666 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r9ks4"] Mar 10 16:26:08 crc kubenswrapper[4749]: E0310 16:26:08.443331 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2c6d767a-41c8-4cbf-a205-e89b2fadd947" containerName="oc" Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.443348 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6d767a-41c8-4cbf-a205-e89b2fadd947" containerName="oc" Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.443573 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c6d767a-41c8-4cbf-a205-e89b2fadd947" containerName="oc" Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.444770 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.460905 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9ks4"] Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.568293 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl2qq\" (UniqueName: \"kubernetes.io/projected/df5c7297-e3ae-4210-b571-ab7fd3dd891b-kube-api-access-rl2qq\") pod \"redhat-marketplace-r9ks4\" (UID: \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\") " pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.568398 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5c7297-e3ae-4210-b571-ab7fd3dd891b-utilities\") pod \"redhat-marketplace-r9ks4\" (UID: \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\") " pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.568430 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5c7297-e3ae-4210-b571-ab7fd3dd891b-catalog-content\") pod \"redhat-marketplace-r9ks4\" (UID: \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\") " 
pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.670559 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl2qq\" (UniqueName: \"kubernetes.io/projected/df5c7297-e3ae-4210-b571-ab7fd3dd891b-kube-api-access-rl2qq\") pod \"redhat-marketplace-r9ks4\" (UID: \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\") " pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.670654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5c7297-e3ae-4210-b571-ab7fd3dd891b-utilities\") pod \"redhat-marketplace-r9ks4\" (UID: \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\") " pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.670686 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5c7297-e3ae-4210-b571-ab7fd3dd891b-catalog-content\") pod \"redhat-marketplace-r9ks4\" (UID: \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\") " pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.671214 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5c7297-e3ae-4210-b571-ab7fd3dd891b-utilities\") pod \"redhat-marketplace-r9ks4\" (UID: \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\") " pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.671411 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5c7297-e3ae-4210-b571-ab7fd3dd891b-catalog-content\") pod \"redhat-marketplace-r9ks4\" (UID: \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\") " pod="openshift-marketplace/redhat-marketplace-r9ks4" 
Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.700457 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl2qq\" (UniqueName: \"kubernetes.io/projected/df5c7297-e3ae-4210-b571-ab7fd3dd891b-kube-api-access-rl2qq\") pod \"redhat-marketplace-r9ks4\" (UID: \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\") " pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:08 crc kubenswrapper[4749]: I0310 16:26:08.782278 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:09 crc kubenswrapper[4749]: I0310 16:26:09.230771 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9ks4"] Mar 10 16:26:09 crc kubenswrapper[4749]: W0310 16:26:09.240237 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf5c7297_e3ae_4210_b571_ab7fd3dd891b.slice/crio-3e533e0ee638951db7a8440b380bd4eb6e9f8516ce83c5af39db0e3883f71dcf WatchSource:0}: Error finding container 3e533e0ee638951db7a8440b380bd4eb6e9f8516ce83c5af39db0e3883f71dcf: Status 404 returned error can't find the container with id 3e533e0ee638951db7a8440b380bd4eb6e9f8516ce83c5af39db0e3883f71dcf Mar 10 16:26:09 crc kubenswrapper[4749]: I0310 16:26:09.315748 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9ks4" event={"ID":"df5c7297-e3ae-4210-b571-ab7fd3dd891b","Type":"ContainerStarted","Data":"3e533e0ee638951db7a8440b380bd4eb6e9f8516ce83c5af39db0e3883f71dcf"} Mar 10 16:26:10 crc kubenswrapper[4749]: I0310 16:26:10.325100 4749 generic.go:334] "Generic (PLEG): container finished" podID="df5c7297-e3ae-4210-b571-ab7fd3dd891b" containerID="5994cca84f4a21527e8243a09c42e44c3a5f8b0d8a316a04dd4a08c2ed55f17a" exitCode=0 Mar 10 16:26:10 crc kubenswrapper[4749]: I0310 16:26:10.325170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-r9ks4" event={"ID":"df5c7297-e3ae-4210-b571-ab7fd3dd891b","Type":"ContainerDied","Data":"5994cca84f4a21527e8243a09c42e44c3a5f8b0d8a316a04dd4a08c2ed55f17a"} Mar 10 16:26:11 crc kubenswrapper[4749]: I0310 16:26:11.333925 4749 generic.go:334] "Generic (PLEG): container finished" podID="df5c7297-e3ae-4210-b571-ab7fd3dd891b" containerID="0d1bc4dafb78775188c8f5b8f8de6a96310b26e42b8e515410a535e899ba77e2" exitCode=0 Mar 10 16:26:11 crc kubenswrapper[4749]: I0310 16:26:11.333965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9ks4" event={"ID":"df5c7297-e3ae-4210-b571-ab7fd3dd891b","Type":"ContainerDied","Data":"0d1bc4dafb78775188c8f5b8f8de6a96310b26e42b8e515410a535e899ba77e2"} Mar 10 16:26:12 crc kubenswrapper[4749]: I0310 16:26:12.343876 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9ks4" event={"ID":"df5c7297-e3ae-4210-b571-ab7fd3dd891b","Type":"ContainerStarted","Data":"31a2adc0ed21f8c3cec7638e5983e8317a3ea459ec1132aa3c4f08e8d7317fad"} Mar 10 16:26:12 crc kubenswrapper[4749]: I0310 16:26:12.365137 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r9ks4" podStartSLOduration=2.904728837 podStartE2EDuration="4.365112622s" podCreationTimestamp="2026-03-10 16:26:08 +0000 UTC" firstStartedPulling="2026-03-10 16:26:10.327280845 +0000 UTC m=+2267.449146532" lastFinishedPulling="2026-03-10 16:26:11.78766463 +0000 UTC m=+2268.909530317" observedRunningTime="2026-03-10 16:26:12.364093953 +0000 UTC m=+2269.485959650" watchObservedRunningTime="2026-03-10 16:26:12.365112622 +0000 UTC m=+2269.486978329" Mar 10 16:26:18 crc kubenswrapper[4749]: I0310 16:26:18.782971 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:18 crc kubenswrapper[4749]: I0310 16:26:18.783923 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:18 crc kubenswrapper[4749]: I0310 16:26:18.854180 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:19 crc kubenswrapper[4749]: I0310 16:26:19.460244 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:19 crc kubenswrapper[4749]: I0310 16:26:19.523281 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9ks4"] Mar 10 16:26:20 crc kubenswrapper[4749]: I0310 16:26:20.980815 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:26:20 crc kubenswrapper[4749]: I0310 16:26:20.980900 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:26:20 crc kubenswrapper[4749]: I0310 16:26:20.980958 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 16:26:20 crc kubenswrapper[4749]: I0310 16:26:20.981799 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7adaaac49c72dbb9804e4450407d6b2a442880a14ae61fa8bea5a508b2de8ea"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:26:20 crc kubenswrapper[4749]: I0310 16:26:20.981890 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://b7adaaac49c72dbb9804e4450407d6b2a442880a14ae61fa8bea5a508b2de8ea" gracePeriod=600 Mar 10 16:26:21 crc kubenswrapper[4749]: I0310 16:26:21.428810 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="b7adaaac49c72dbb9804e4450407d6b2a442880a14ae61fa8bea5a508b2de8ea" exitCode=0 Mar 10 16:26:21 crc kubenswrapper[4749]: I0310 16:26:21.429207 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r9ks4" podUID="df5c7297-e3ae-4210-b571-ab7fd3dd891b" containerName="registry-server" containerID="cri-o://31a2adc0ed21f8c3cec7638e5983e8317a3ea459ec1132aa3c4f08e8d7317fad" gracePeriod=2 Mar 10 16:26:21 crc kubenswrapper[4749]: I0310 16:26:21.428890 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"b7adaaac49c72dbb9804e4450407d6b2a442880a14ae61fa8bea5a508b2de8ea"} Mar 10 16:26:21 crc kubenswrapper[4749]: I0310 16:26:21.429302 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a"} Mar 10 16:26:21 crc kubenswrapper[4749]: I0310 16:26:21.429321 4749 scope.go:117] "RemoveContainer" containerID="ef9c36cb500205a4821b781923c2f9969b2157bd048532e7b4fe73a3c6aa84aa" Mar 10 16:26:21 crc kubenswrapper[4749]: I0310 16:26:21.919938 
4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:21 crc kubenswrapper[4749]: I0310 16:26:21.975582 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5c7297-e3ae-4210-b571-ab7fd3dd891b-catalog-content\") pod \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\" (UID: \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\") " Mar 10 16:26:21 crc kubenswrapper[4749]: I0310 16:26:21.975649 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5c7297-e3ae-4210-b571-ab7fd3dd891b-utilities\") pod \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\" (UID: \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\") " Mar 10 16:26:21 crc kubenswrapper[4749]: I0310 16:26:21.975674 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl2qq\" (UniqueName: \"kubernetes.io/projected/df5c7297-e3ae-4210-b571-ab7fd3dd891b-kube-api-access-rl2qq\") pod \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\" (UID: \"df5c7297-e3ae-4210-b571-ab7fd3dd891b\") " Mar 10 16:26:21 crc kubenswrapper[4749]: I0310 16:26:21.976936 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5c7297-e3ae-4210-b571-ab7fd3dd891b-utilities" (OuterVolumeSpecName: "utilities") pod "df5c7297-e3ae-4210-b571-ab7fd3dd891b" (UID: "df5c7297-e3ae-4210-b571-ab7fd3dd891b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:26:21 crc kubenswrapper[4749]: I0310 16:26:21.981909 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df5c7297-e3ae-4210-b571-ab7fd3dd891b-kube-api-access-rl2qq" (OuterVolumeSpecName: "kube-api-access-rl2qq") pod "df5c7297-e3ae-4210-b571-ab7fd3dd891b" (UID: "df5c7297-e3ae-4210-b571-ab7fd3dd891b"). 
InnerVolumeSpecName "kube-api-access-rl2qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.077086 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5c7297-e3ae-4210-b571-ab7fd3dd891b-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.077348 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl2qq\" (UniqueName: \"kubernetes.io/projected/df5c7297-e3ae-4210-b571-ab7fd3dd891b-kube-api-access-rl2qq\") on node \"crc\" DevicePath \"\"" Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.183661 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5c7297-e3ae-4210-b571-ab7fd3dd891b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df5c7297-e3ae-4210-b571-ab7fd3dd891b" (UID: "df5c7297-e3ae-4210-b571-ab7fd3dd891b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.280204 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5c7297-e3ae-4210-b571-ab7fd3dd891b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.446731 4749 generic.go:334] "Generic (PLEG): container finished" podID="df5c7297-e3ae-4210-b571-ab7fd3dd891b" containerID="31a2adc0ed21f8c3cec7638e5983e8317a3ea459ec1132aa3c4f08e8d7317fad" exitCode=0 Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.446800 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9ks4" event={"ID":"df5c7297-e3ae-4210-b571-ab7fd3dd891b","Type":"ContainerDied","Data":"31a2adc0ed21f8c3cec7638e5983e8317a3ea459ec1132aa3c4f08e8d7317fad"} Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.446849 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9ks4" Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.447085 4749 scope.go:117] "RemoveContainer" containerID="31a2adc0ed21f8c3cec7638e5983e8317a3ea459ec1132aa3c4f08e8d7317fad" Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.447067 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9ks4" event={"ID":"df5c7297-e3ae-4210-b571-ab7fd3dd891b","Type":"ContainerDied","Data":"3e533e0ee638951db7a8440b380bd4eb6e9f8516ce83c5af39db0e3883f71dcf"} Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.480776 4749 scope.go:117] "RemoveContainer" containerID="0d1bc4dafb78775188c8f5b8f8de6a96310b26e42b8e515410a535e899ba77e2" Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.507559 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9ks4"] Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.517421 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9ks4"] Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.538670 4749 scope.go:117] "RemoveContainer" containerID="5994cca84f4a21527e8243a09c42e44c3a5f8b0d8a316a04dd4a08c2ed55f17a" Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.562694 4749 scope.go:117] "RemoveContainer" containerID="31a2adc0ed21f8c3cec7638e5983e8317a3ea459ec1132aa3c4f08e8d7317fad" Mar 10 16:26:22 crc kubenswrapper[4749]: E0310 16:26:22.563116 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31a2adc0ed21f8c3cec7638e5983e8317a3ea459ec1132aa3c4f08e8d7317fad\": container with ID starting with 31a2adc0ed21f8c3cec7638e5983e8317a3ea459ec1132aa3c4f08e8d7317fad not found: ID does not exist" containerID="31a2adc0ed21f8c3cec7638e5983e8317a3ea459ec1132aa3c4f08e8d7317fad" Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.563172 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a2adc0ed21f8c3cec7638e5983e8317a3ea459ec1132aa3c4f08e8d7317fad"} err="failed to get container status \"31a2adc0ed21f8c3cec7638e5983e8317a3ea459ec1132aa3c4f08e8d7317fad\": rpc error: code = NotFound desc = could not find container \"31a2adc0ed21f8c3cec7638e5983e8317a3ea459ec1132aa3c4f08e8d7317fad\": container with ID starting with 31a2adc0ed21f8c3cec7638e5983e8317a3ea459ec1132aa3c4f08e8d7317fad not found: ID does not exist" Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.563213 4749 scope.go:117] "RemoveContainer" containerID="0d1bc4dafb78775188c8f5b8f8de6a96310b26e42b8e515410a535e899ba77e2" Mar 10 16:26:22 crc kubenswrapper[4749]: E0310 16:26:22.563745 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d1bc4dafb78775188c8f5b8f8de6a96310b26e42b8e515410a535e899ba77e2\": container with ID starting with 0d1bc4dafb78775188c8f5b8f8de6a96310b26e42b8e515410a535e899ba77e2 not found: ID does not exist" containerID="0d1bc4dafb78775188c8f5b8f8de6a96310b26e42b8e515410a535e899ba77e2" Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.563783 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d1bc4dafb78775188c8f5b8f8de6a96310b26e42b8e515410a535e899ba77e2"} err="failed to get container status \"0d1bc4dafb78775188c8f5b8f8de6a96310b26e42b8e515410a535e899ba77e2\": rpc error: code = NotFound desc = could not find container \"0d1bc4dafb78775188c8f5b8f8de6a96310b26e42b8e515410a535e899ba77e2\": container with ID starting with 0d1bc4dafb78775188c8f5b8f8de6a96310b26e42b8e515410a535e899ba77e2 not found: ID does not exist" Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.563811 4749 scope.go:117] "RemoveContainer" containerID="5994cca84f4a21527e8243a09c42e44c3a5f8b0d8a316a04dd4a08c2ed55f17a" Mar 10 16:26:22 crc kubenswrapper[4749]: E0310 
16:26:22.564313 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5994cca84f4a21527e8243a09c42e44c3a5f8b0d8a316a04dd4a08c2ed55f17a\": container with ID starting with 5994cca84f4a21527e8243a09c42e44c3a5f8b0d8a316a04dd4a08c2ed55f17a not found: ID does not exist" containerID="5994cca84f4a21527e8243a09c42e44c3a5f8b0d8a316a04dd4a08c2ed55f17a" Mar 10 16:26:22 crc kubenswrapper[4749]: I0310 16:26:22.564346 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5994cca84f4a21527e8243a09c42e44c3a5f8b0d8a316a04dd4a08c2ed55f17a"} err="failed to get container status \"5994cca84f4a21527e8243a09c42e44c3a5f8b0d8a316a04dd4a08c2ed55f17a\": rpc error: code = NotFound desc = could not find container \"5994cca84f4a21527e8243a09c42e44c3a5f8b0d8a316a04dd4a08c2ed55f17a\": container with ID starting with 5994cca84f4a21527e8243a09c42e44c3a5f8b0d8a316a04dd4a08c2ed55f17a not found: ID does not exist" Mar 10 16:26:23 crc kubenswrapper[4749]: I0310 16:26:23.623584 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df5c7297-e3ae-4210-b571-ab7fd3dd891b" path="/var/lib/kubelet/pods/df5c7297-e3ae-4210-b571-ab7fd3dd891b/volumes" Mar 10 16:26:24 crc kubenswrapper[4749]: I0310 16:26:24.313434 4749 scope.go:117] "RemoveContainer" containerID="68e59d847a6ebfa9f44d4ff4f97e9ce4189678c91efb26361081c371dbdbd9af" Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.151331 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552668-zpf6c"] Mar 10 16:28:00 crc kubenswrapper[4749]: E0310 16:28:00.152142 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5c7297-e3ae-4210-b571-ab7fd3dd891b" containerName="registry-server" Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.152156 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5c7297-e3ae-4210-b571-ab7fd3dd891b" 
containerName="registry-server" Mar 10 16:28:00 crc kubenswrapper[4749]: E0310 16:28:00.152168 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5c7297-e3ae-4210-b571-ab7fd3dd891b" containerName="extract-content" Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.152174 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5c7297-e3ae-4210-b571-ab7fd3dd891b" containerName="extract-content" Mar 10 16:28:00 crc kubenswrapper[4749]: E0310 16:28:00.152202 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5c7297-e3ae-4210-b571-ab7fd3dd891b" containerName="extract-utilities" Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.152210 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5c7297-e3ae-4210-b571-ab7fd3dd891b" containerName="extract-utilities" Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.152397 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5c7297-e3ae-4210-b571-ab7fd3dd891b" containerName="registry-server" Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.152855 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552668-zpf6c" Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.155462 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.155544 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.155596 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.174387 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552668-zpf6c"] Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.206482 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmqx8\" (UniqueName: \"kubernetes.io/projected/c34e624f-99ae-4766-81df-991a7b3c882c-kube-api-access-kmqx8\") pod \"auto-csr-approver-29552668-zpf6c\" (UID: \"c34e624f-99ae-4766-81df-991a7b3c882c\") " pod="openshift-infra/auto-csr-approver-29552668-zpf6c" Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.308215 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmqx8\" (UniqueName: \"kubernetes.io/projected/c34e624f-99ae-4766-81df-991a7b3c882c-kube-api-access-kmqx8\") pod \"auto-csr-approver-29552668-zpf6c\" (UID: \"c34e624f-99ae-4766-81df-991a7b3c882c\") " pod="openshift-infra/auto-csr-approver-29552668-zpf6c" Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.326178 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmqx8\" (UniqueName: \"kubernetes.io/projected/c34e624f-99ae-4766-81df-991a7b3c882c-kube-api-access-kmqx8\") pod \"auto-csr-approver-29552668-zpf6c\" (UID: \"c34e624f-99ae-4766-81df-991a7b3c882c\") " 
pod="openshift-infra/auto-csr-approver-29552668-zpf6c" Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.473304 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552668-zpf6c" Mar 10 16:28:00 crc kubenswrapper[4749]: I0310 16:28:00.888053 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552668-zpf6c"] Mar 10 16:28:01 crc kubenswrapper[4749]: I0310 16:28:01.272250 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552668-zpf6c" event={"ID":"c34e624f-99ae-4766-81df-991a7b3c882c","Type":"ContainerStarted","Data":"64a323f7594914ee2623410f06bf29b312389f53c30fdeacd36d7716af38670c"} Mar 10 16:28:03 crc kubenswrapper[4749]: I0310 16:28:03.289164 4749 generic.go:334] "Generic (PLEG): container finished" podID="c34e624f-99ae-4766-81df-991a7b3c882c" containerID="53dede1c82a95b1e02149e212868372ed8f8d76cef98d688991a55336f53f533" exitCode=0 Mar 10 16:28:03 crc kubenswrapper[4749]: I0310 16:28:03.289253 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552668-zpf6c" event={"ID":"c34e624f-99ae-4766-81df-991a7b3c882c","Type":"ContainerDied","Data":"53dede1c82a95b1e02149e212868372ed8f8d76cef98d688991a55336f53f533"} Mar 10 16:28:04 crc kubenswrapper[4749]: I0310 16:28:04.642528 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552668-zpf6c" Mar 10 16:28:04 crc kubenswrapper[4749]: I0310 16:28:04.672604 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmqx8\" (UniqueName: \"kubernetes.io/projected/c34e624f-99ae-4766-81df-991a7b3c882c-kube-api-access-kmqx8\") pod \"c34e624f-99ae-4766-81df-991a7b3c882c\" (UID: \"c34e624f-99ae-4766-81df-991a7b3c882c\") " Mar 10 16:28:04 crc kubenswrapper[4749]: I0310 16:28:04.682613 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c34e624f-99ae-4766-81df-991a7b3c882c-kube-api-access-kmqx8" (OuterVolumeSpecName: "kube-api-access-kmqx8") pod "c34e624f-99ae-4766-81df-991a7b3c882c" (UID: "c34e624f-99ae-4766-81df-991a7b3c882c"). InnerVolumeSpecName "kube-api-access-kmqx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:28:04 crc kubenswrapper[4749]: I0310 16:28:04.774684 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmqx8\" (UniqueName: \"kubernetes.io/projected/c34e624f-99ae-4766-81df-991a7b3c882c-kube-api-access-kmqx8\") on node \"crc\" DevicePath \"\"" Mar 10 16:28:05 crc kubenswrapper[4749]: I0310 16:28:05.310059 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552668-zpf6c" event={"ID":"c34e624f-99ae-4766-81df-991a7b3c882c","Type":"ContainerDied","Data":"64a323f7594914ee2623410f06bf29b312389f53c30fdeacd36d7716af38670c"} Mar 10 16:28:05 crc kubenswrapper[4749]: I0310 16:28:05.310560 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64a323f7594914ee2623410f06bf29b312389f53c30fdeacd36d7716af38670c" Mar 10 16:28:05 crc kubenswrapper[4749]: I0310 16:28:05.310901 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552668-zpf6c" Mar 10 16:28:05 crc kubenswrapper[4749]: I0310 16:28:05.719241 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552662-6kqd2"] Mar 10 16:28:05 crc kubenswrapper[4749]: I0310 16:28:05.724511 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552662-6kqd2"] Mar 10 16:28:07 crc kubenswrapper[4749]: I0310 16:28:07.620846 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c56f3c78-8506-41fb-9385-33676fd1d66d" path="/var/lib/kubelet/pods/c56f3c78-8506-41fb-9385-33676fd1d66d/volumes" Mar 10 16:28:24 crc kubenswrapper[4749]: I0310 16:28:24.423390 4749 scope.go:117] "RemoveContainer" containerID="4102c36ba098f77a65a6ad3eb3d29ff1663705100d411957f4483ac9d392364c" Mar 10 16:28:50 crc kubenswrapper[4749]: I0310 16:28:50.980499 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:28:50 crc kubenswrapper[4749]: I0310 16:28:50.981009 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:29:20 crc kubenswrapper[4749]: I0310 16:29:20.980546 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:29:20 crc kubenswrapper[4749]: 
I0310 16:29:20.982610 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:29:50 crc kubenswrapper[4749]: I0310 16:29:50.980473 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:29:50 crc kubenswrapper[4749]: I0310 16:29:50.981089 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:29:50 crc kubenswrapper[4749]: I0310 16:29:50.981161 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 16:29:50 crc kubenswrapper[4749]: I0310 16:29:50.982192 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:29:50 crc kubenswrapper[4749]: I0310 16:29:50.982336 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" 
containerName="machine-config-daemon" containerID="cri-o://90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" gracePeriod=600 Mar 10 16:29:51 crc kubenswrapper[4749]: E0310 16:29:51.125426 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:29:51 crc kubenswrapper[4749]: I0310 16:29:51.128571 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" exitCode=0 Mar 10 16:29:51 crc kubenswrapper[4749]: I0310 16:29:51.128610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a"} Mar 10 16:29:51 crc kubenswrapper[4749]: I0310 16:29:51.128640 4749 scope.go:117] "RemoveContainer" containerID="b7adaaac49c72dbb9804e4450407d6b2a442880a14ae61fa8bea5a508b2de8ea" Mar 10 16:29:52 crc kubenswrapper[4749]: I0310 16:29:52.137764 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:29:52 crc kubenswrapper[4749]: E0310 16:29:52.138101 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.155239 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552670-cspng"] Mar 10 16:30:00 crc kubenswrapper[4749]: E0310 16:30:00.156270 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c34e624f-99ae-4766-81df-991a7b3c882c" containerName="oc" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.156289 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34e624f-99ae-4766-81df-991a7b3c882c" containerName="oc" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.156483 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c34e624f-99ae-4766-81df-991a7b3c882c" containerName="oc" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.157072 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552670-cspng" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.160301 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.160347 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.160895 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.164743 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c"] Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.165647 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.167217 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.167396 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.173107 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c"] Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.194825 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552670-cspng"] Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.316656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62954212-e121-40d6-aa39-972fdb4f2873-config-volume\") pod \"collect-profiles-29552670-9qs7c\" (UID: \"62954212-e121-40d6-aa39-972fdb4f2873\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.316731 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt7mk\" (UniqueName: \"kubernetes.io/projected/9d862895-f120-423b-b784-89631694662d-kube-api-access-kt7mk\") pod \"auto-csr-approver-29552670-cspng\" (UID: \"9d862895-f120-423b-b784-89631694662d\") " pod="openshift-infra/auto-csr-approver-29552670-cspng" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.316758 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5j8b\" (UniqueName: 
\"kubernetes.io/projected/62954212-e121-40d6-aa39-972fdb4f2873-kube-api-access-p5j8b\") pod \"collect-profiles-29552670-9qs7c\" (UID: \"62954212-e121-40d6-aa39-972fdb4f2873\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.316775 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62954212-e121-40d6-aa39-972fdb4f2873-secret-volume\") pod \"collect-profiles-29552670-9qs7c\" (UID: \"62954212-e121-40d6-aa39-972fdb4f2873\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.417911 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62954212-e121-40d6-aa39-972fdb4f2873-config-volume\") pod \"collect-profiles-29552670-9qs7c\" (UID: \"62954212-e121-40d6-aa39-972fdb4f2873\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.418276 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt7mk\" (UniqueName: \"kubernetes.io/projected/9d862895-f120-423b-b784-89631694662d-kube-api-access-kt7mk\") pod \"auto-csr-approver-29552670-cspng\" (UID: \"9d862895-f120-423b-b784-89631694662d\") " pod="openshift-infra/auto-csr-approver-29552670-cspng" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.418400 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5j8b\" (UniqueName: \"kubernetes.io/projected/62954212-e121-40d6-aa39-972fdb4f2873-kube-api-access-p5j8b\") pod \"collect-profiles-29552670-9qs7c\" (UID: \"62954212-e121-40d6-aa39-972fdb4f2873\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" Mar 10 16:30:00 crc 
kubenswrapper[4749]: I0310 16:30:00.418503 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62954212-e121-40d6-aa39-972fdb4f2873-secret-volume\") pod \"collect-profiles-29552670-9qs7c\" (UID: \"62954212-e121-40d6-aa39-972fdb4f2873\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.419591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62954212-e121-40d6-aa39-972fdb4f2873-config-volume\") pod \"collect-profiles-29552670-9qs7c\" (UID: \"62954212-e121-40d6-aa39-972fdb4f2873\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.429494 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62954212-e121-40d6-aa39-972fdb4f2873-secret-volume\") pod \"collect-profiles-29552670-9qs7c\" (UID: \"62954212-e121-40d6-aa39-972fdb4f2873\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.438535 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5j8b\" (UniqueName: \"kubernetes.io/projected/62954212-e121-40d6-aa39-972fdb4f2873-kube-api-access-p5j8b\") pod \"collect-profiles-29552670-9qs7c\" (UID: \"62954212-e121-40d6-aa39-972fdb4f2873\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.438980 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt7mk\" (UniqueName: \"kubernetes.io/projected/9d862895-f120-423b-b784-89631694662d-kube-api-access-kt7mk\") pod \"auto-csr-approver-29552670-cspng\" (UID: \"9d862895-f120-423b-b784-89631694662d\") " 
pod="openshift-infra/auto-csr-approver-29552670-cspng" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.477864 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552670-cspng" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.492714 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.912019 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552670-cspng"] Mar 10 16:30:00 crc kubenswrapper[4749]: I0310 16:30:00.960589 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c"] Mar 10 16:30:01 crc kubenswrapper[4749]: I0310 16:30:01.224748 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552670-cspng" event={"ID":"9d862895-f120-423b-b784-89631694662d","Type":"ContainerStarted","Data":"f8de9cd13c7b9eb57b69bc13e4f18dad368838947ec1507661f394a08de270fb"} Mar 10 16:30:01 crc kubenswrapper[4749]: I0310 16:30:01.228945 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" event={"ID":"62954212-e121-40d6-aa39-972fdb4f2873","Type":"ContainerStarted","Data":"80e8d4c67c173dbcbf27ff2986cc013adfe74daa53227fa451f438c349b95ddc"} Mar 10 16:30:01 crc kubenswrapper[4749]: I0310 16:30:01.228992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" event={"ID":"62954212-e121-40d6-aa39-972fdb4f2873","Type":"ContainerStarted","Data":"3301c8887ce8b8e25ce7df87284a257343e20eca81de1e4486c49a92a5ff398a"} Mar 10 16:30:01 crc kubenswrapper[4749]: I0310 16:30:01.254317 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" podStartSLOduration=1.2543022289999999 podStartE2EDuration="1.254302229s" podCreationTimestamp="2026-03-10 16:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:30:01.248323767 +0000 UTC m=+2498.370189464" watchObservedRunningTime="2026-03-10 16:30:01.254302229 +0000 UTC m=+2498.376167916" Mar 10 16:30:02 crc kubenswrapper[4749]: I0310 16:30:02.238527 4749 generic.go:334] "Generic (PLEG): container finished" podID="62954212-e121-40d6-aa39-972fdb4f2873" containerID="80e8d4c67c173dbcbf27ff2986cc013adfe74daa53227fa451f438c349b95ddc" exitCode=0 Mar 10 16:30:02 crc kubenswrapper[4749]: I0310 16:30:02.238620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" event={"ID":"62954212-e121-40d6-aa39-972fdb4f2873","Type":"ContainerDied","Data":"80e8d4c67c173dbcbf27ff2986cc013adfe74daa53227fa451f438c349b95ddc"} Mar 10 16:30:03 crc kubenswrapper[4749]: I0310 16:30:03.252801 4749 generic.go:334] "Generic (PLEG): container finished" podID="9d862895-f120-423b-b784-89631694662d" containerID="8af020c6f3346045f1de6c473bbc9aeb04daeda8eda0c6ea5879e37393db7403" exitCode=0 Mar 10 16:30:03 crc kubenswrapper[4749]: I0310 16:30:03.252980 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552670-cspng" event={"ID":"9d862895-f120-423b-b784-89631694662d","Type":"ContainerDied","Data":"8af020c6f3346045f1de6c473bbc9aeb04daeda8eda0c6ea5879e37393db7403"} Mar 10 16:30:03 crc kubenswrapper[4749]: I0310 16:30:03.492125 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" Mar 10 16:30:03 crc kubenswrapper[4749]: I0310 16:30:03.569045 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62954212-e121-40d6-aa39-972fdb4f2873-secret-volume\") pod \"62954212-e121-40d6-aa39-972fdb4f2873\" (UID: \"62954212-e121-40d6-aa39-972fdb4f2873\") " Mar 10 16:30:03 crc kubenswrapper[4749]: I0310 16:30:03.569134 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5j8b\" (UniqueName: \"kubernetes.io/projected/62954212-e121-40d6-aa39-972fdb4f2873-kube-api-access-p5j8b\") pod \"62954212-e121-40d6-aa39-972fdb4f2873\" (UID: \"62954212-e121-40d6-aa39-972fdb4f2873\") " Mar 10 16:30:03 crc kubenswrapper[4749]: I0310 16:30:03.569351 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62954212-e121-40d6-aa39-972fdb4f2873-config-volume\") pod \"62954212-e121-40d6-aa39-972fdb4f2873\" (UID: \"62954212-e121-40d6-aa39-972fdb4f2873\") " Mar 10 16:30:03 crc kubenswrapper[4749]: I0310 16:30:03.570665 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62954212-e121-40d6-aa39-972fdb4f2873-config-volume" (OuterVolumeSpecName: "config-volume") pod "62954212-e121-40d6-aa39-972fdb4f2873" (UID: "62954212-e121-40d6-aa39-972fdb4f2873"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:30:03 crc kubenswrapper[4749]: I0310 16:30:03.575310 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62954212-e121-40d6-aa39-972fdb4f2873-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "62954212-e121-40d6-aa39-972fdb4f2873" (UID: "62954212-e121-40d6-aa39-972fdb4f2873"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 16:30:03 crc kubenswrapper[4749]: I0310 16:30:03.577909 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62954212-e121-40d6-aa39-972fdb4f2873-kube-api-access-p5j8b" (OuterVolumeSpecName: "kube-api-access-p5j8b") pod "62954212-e121-40d6-aa39-972fdb4f2873" (UID: "62954212-e121-40d6-aa39-972fdb4f2873"). InnerVolumeSpecName "kube-api-access-p5j8b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:30:03 crc kubenswrapper[4749]: I0310 16:30:03.670565 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62954212-e121-40d6-aa39-972fdb4f2873-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 16:30:03 crc kubenswrapper[4749]: I0310 16:30:03.670603 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5j8b\" (UniqueName: \"kubernetes.io/projected/62954212-e121-40d6-aa39-972fdb4f2873-kube-api-access-p5j8b\") on node \"crc\" DevicePath \"\""
Mar 10 16:30:03 crc kubenswrapper[4749]: I0310 16:30:03.670614 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62954212-e121-40d6-aa39-972fdb4f2873-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 16:30:04 crc kubenswrapper[4749]: I0310 16:30:04.265424 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c" event={"ID":"62954212-e121-40d6-aa39-972fdb4f2873","Type":"ContainerDied","Data":"3301c8887ce8b8e25ce7df87284a257343e20eca81de1e4486c49a92a5ff398a"}
Mar 10 16:30:04 crc kubenswrapper[4749]: I0310 16:30:04.265471 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c"
Mar 10 16:30:04 crc kubenswrapper[4749]: I0310 16:30:04.265481 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3301c8887ce8b8e25ce7df87284a257343e20eca81de1e4486c49a92a5ff398a"
Mar 10 16:30:04 crc kubenswrapper[4749]: I0310 16:30:04.334926 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr"]
Mar 10 16:30:04 crc kubenswrapper[4749]: I0310 16:30:04.343279 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552625-97htr"]
Mar 10 16:30:04 crc kubenswrapper[4749]: I0310 16:30:04.550272 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552670-cspng"
Mar 10 16:30:04 crc kubenswrapper[4749]: I0310 16:30:04.682458 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt7mk\" (UniqueName: \"kubernetes.io/projected/9d862895-f120-423b-b784-89631694662d-kube-api-access-kt7mk\") pod \"9d862895-f120-423b-b784-89631694662d\" (UID: \"9d862895-f120-423b-b784-89631694662d\") "
Mar 10 16:30:04 crc kubenswrapper[4749]: I0310 16:30:04.688587 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d862895-f120-423b-b784-89631694662d-kube-api-access-kt7mk" (OuterVolumeSpecName: "kube-api-access-kt7mk") pod "9d862895-f120-423b-b784-89631694662d" (UID: "9d862895-f120-423b-b784-89631694662d"). InnerVolumeSpecName "kube-api-access-kt7mk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:30:04 crc kubenswrapper[4749]: I0310 16:30:04.784197 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt7mk\" (UniqueName: \"kubernetes.io/projected/9d862895-f120-423b-b784-89631694662d-kube-api-access-kt7mk\") on node \"crc\" DevicePath \"\""
Mar 10 16:30:05 crc kubenswrapper[4749]: I0310 16:30:05.279916 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552670-cspng" event={"ID":"9d862895-f120-423b-b784-89631694662d","Type":"ContainerDied","Data":"f8de9cd13c7b9eb57b69bc13e4f18dad368838947ec1507661f394a08de270fb"}
Mar 10 16:30:05 crc kubenswrapper[4749]: I0310 16:30:05.279966 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8de9cd13c7b9eb57b69bc13e4f18dad368838947ec1507661f394a08de270fb"
Mar 10 16:30:05 crc kubenswrapper[4749]: I0310 16:30:05.280036 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552670-cspng"
Mar 10 16:30:05 crc kubenswrapper[4749]: I0310 16:30:05.619254 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b009296-7e7c-4e1b-bec2-24cf75849218" path="/var/lib/kubelet/pods/3b009296-7e7c-4e1b-bec2-24cf75849218/volumes"
Mar 10 16:30:05 crc kubenswrapper[4749]: I0310 16:30:05.620732 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552664-hprkd"]
Mar 10 16:30:05 crc kubenswrapper[4749]: I0310 16:30:05.628048 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552664-hprkd"]
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.306119 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6wzkh"]
Mar 10 16:30:07 crc kubenswrapper[4749]: E0310 16:30:07.306628 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62954212-e121-40d6-aa39-972fdb4f2873" containerName="collect-profiles"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.306640 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="62954212-e121-40d6-aa39-972fdb4f2873" containerName="collect-profiles"
Mar 10 16:30:07 crc kubenswrapper[4749]: E0310 16:30:07.306663 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d862895-f120-423b-b784-89631694662d" containerName="oc"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.306670 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d862895-f120-423b-b784-89631694662d" containerName="oc"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.306801 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="62954212-e121-40d6-aa39-972fdb4f2873" containerName="collect-profiles"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.306818 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d862895-f120-423b-b784-89631694662d" containerName="oc"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.307697 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.323484 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6wzkh"]
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.427044 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5t78\" (UniqueName: \"kubernetes.io/projected/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-kube-api-access-n5t78\") pod \"community-operators-6wzkh\" (UID: \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\") " pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.427362 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-utilities\") pod \"community-operators-6wzkh\" (UID: \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\") " pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.427551 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-catalog-content\") pod \"community-operators-6wzkh\" (UID: \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\") " pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.529138 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5t78\" (UniqueName: \"kubernetes.io/projected/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-kube-api-access-n5t78\") pod \"community-operators-6wzkh\" (UID: \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\") " pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.529730 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-utilities\") pod \"community-operators-6wzkh\" (UID: \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\") " pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.530270 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-catalog-content\") pod \"community-operators-6wzkh\" (UID: \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\") " pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.530214 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-utilities\") pod \"community-operators-6wzkh\" (UID: \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\") " pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.530608 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-catalog-content\") pod \"community-operators-6wzkh\" (UID: \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\") " pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.547569 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5t78\" (UniqueName: \"kubernetes.io/projected/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-kube-api-access-n5t78\") pod \"community-operators-6wzkh\" (UID: \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\") " pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.606669 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a"
Mar 10 16:30:07 crc kubenswrapper[4749]: E0310 16:30:07.606896 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.619201 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34e83e07-cfae-4232-8fcb-945eca1c4425" path="/var/lib/kubelet/pods/34e83e07-cfae-4232-8fcb-945eca1c4425/volumes"
Mar 10 16:30:07 crc kubenswrapper[4749]: I0310 16:30:07.639126 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:08 crc kubenswrapper[4749]: I0310 16:30:08.120822 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6wzkh"]
Mar 10 16:30:08 crc kubenswrapper[4749]: W0310 16:30:08.127003 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85310e33_bf20_4dd1_a8bc_1e8a28ad2da8.slice/crio-fbf3dc3702220dba1bf228964e17bf13538b816bb29ba4ad57f4854158f04890 WatchSource:0}: Error finding container fbf3dc3702220dba1bf228964e17bf13538b816bb29ba4ad57f4854158f04890: Status 404 returned error can't find the container with id fbf3dc3702220dba1bf228964e17bf13538b816bb29ba4ad57f4854158f04890
Mar 10 16:30:08 crc kubenswrapper[4749]: I0310 16:30:08.310568 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wzkh" event={"ID":"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8","Type":"ContainerStarted","Data":"49c77b9e55d41ee88efe5b91e6bebbb8a70ea6c5657a6a17320fed708feeb48f"}
Mar 10 16:30:08 crc kubenswrapper[4749]: I0310 16:30:08.312344 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wzkh" event={"ID":"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8","Type":"ContainerStarted","Data":"fbf3dc3702220dba1bf228964e17bf13538b816bb29ba4ad57f4854158f04890"}
Mar 10 16:30:09 crc kubenswrapper[4749]: I0310 16:30:09.321457 4749 generic.go:334] "Generic (PLEG): container finished" podID="85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" containerID="49c77b9e55d41ee88efe5b91e6bebbb8a70ea6c5657a6a17320fed708feeb48f" exitCode=0
Mar 10 16:30:09 crc kubenswrapper[4749]: I0310 16:30:09.322309 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wzkh" event={"ID":"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8","Type":"ContainerDied","Data":"49c77b9e55d41ee88efe5b91e6bebbb8a70ea6c5657a6a17320fed708feeb48f"}
Mar 10 16:30:10 crc kubenswrapper[4749]: I0310 16:30:10.329829 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wzkh" event={"ID":"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8","Type":"ContainerStarted","Data":"575f63a9680a763a2e40c4ac3ad1d6717f511109b3f1042de55993142264c47d"}
Mar 10 16:30:11 crc kubenswrapper[4749]: I0310 16:30:11.340731 4749 generic.go:334] "Generic (PLEG): container finished" podID="85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" containerID="575f63a9680a763a2e40c4ac3ad1d6717f511109b3f1042de55993142264c47d" exitCode=0
Mar 10 16:30:11 crc kubenswrapper[4749]: I0310 16:30:11.340786 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wzkh" event={"ID":"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8","Type":"ContainerDied","Data":"575f63a9680a763a2e40c4ac3ad1d6717f511109b3f1042de55993142264c47d"}
Mar 10 16:30:12 crc kubenswrapper[4749]: I0310 16:30:12.349878 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wzkh" event={"ID":"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8","Type":"ContainerStarted","Data":"0e8142b0ec4100ee1982091b273bacdf9b7d282d43fc83d6cb6d1ee723a3d24f"}
Mar 10 16:30:12 crc kubenswrapper[4749]: I0310 16:30:12.372473 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6wzkh" podStartSLOduration=2.711712738 podStartE2EDuration="5.372453869s" podCreationTimestamp="2026-03-10 16:30:07 +0000 UTC" firstStartedPulling="2026-03-10 16:30:09.324960445 +0000 UTC m=+2506.446826132" lastFinishedPulling="2026-03-10 16:30:11.985701546 +0000 UTC m=+2509.107567263" observedRunningTime="2026-03-10 16:30:12.367922527 +0000 UTC m=+2509.489788224" watchObservedRunningTime="2026-03-10 16:30:12.372453869 +0000 UTC m=+2509.494319566"
Mar 10 16:30:17 crc kubenswrapper[4749]: I0310 16:30:17.639333 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:17 crc kubenswrapper[4749]: I0310 16:30:17.640308 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:17 crc kubenswrapper[4749]: I0310 16:30:17.729784 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:18 crc kubenswrapper[4749]: I0310 16:30:18.443019 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:18 crc kubenswrapper[4749]: I0310 16:30:18.495091 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6wzkh"]
Mar 10 16:30:19 crc kubenswrapper[4749]: I0310 16:30:19.610798 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a"
Mar 10 16:30:19 crc kubenswrapper[4749]: E0310 16:30:19.613757 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 16:30:20 crc kubenswrapper[4749]: I0310 16:30:20.408742 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6wzkh" podUID="85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" containerName="registry-server" containerID="cri-o://0e8142b0ec4100ee1982091b273bacdf9b7d282d43fc83d6cb6d1ee723a3d24f" gracePeriod=2
Mar 10 16:30:20 crc kubenswrapper[4749]: I0310 16:30:20.758474 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:20 crc kubenswrapper[4749]: I0310 16:30:20.846765 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-catalog-content\") pod \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\" (UID: \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\") "
Mar 10 16:30:20 crc kubenswrapper[4749]: I0310 16:30:20.847092 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5t78\" (UniqueName: \"kubernetes.io/projected/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-kube-api-access-n5t78\") pod \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\" (UID: \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\") "
Mar 10 16:30:20 crc kubenswrapper[4749]: I0310 16:30:20.847267 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-utilities\") pod \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\" (UID: \"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8\") "
Mar 10 16:30:20 crc kubenswrapper[4749]: I0310 16:30:20.848319 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-utilities" (OuterVolumeSpecName: "utilities") pod "85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" (UID: "85310e33-bf20-4dd1-a8bc-1e8a28ad2da8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 16:30:20 crc kubenswrapper[4749]: I0310 16:30:20.852669 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-kube-api-access-n5t78" (OuterVolumeSpecName: "kube-api-access-n5t78") pod "85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" (UID: "85310e33-bf20-4dd1-a8bc-1e8a28ad2da8"). InnerVolumeSpecName "kube-api-access-n5t78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:30:20 crc kubenswrapper[4749]: I0310 16:30:20.948654 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 16:30:20 crc kubenswrapper[4749]: I0310 16:30:20.949009 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5t78\" (UniqueName: \"kubernetes.io/projected/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-kube-api-access-n5t78\") on node \"crc\" DevicePath \"\""
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.420916 4749 generic.go:334] "Generic (PLEG): container finished" podID="85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" containerID="0e8142b0ec4100ee1982091b273bacdf9b7d282d43fc83d6cb6d1ee723a3d24f" exitCode=0
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.421018 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wzkh" event={"ID":"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8","Type":"ContainerDied","Data":"0e8142b0ec4100ee1982091b273bacdf9b7d282d43fc83d6cb6d1ee723a3d24f"}
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.421098 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6wzkh"
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.421790 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wzkh" event={"ID":"85310e33-bf20-4dd1-a8bc-1e8a28ad2da8","Type":"ContainerDied","Data":"fbf3dc3702220dba1bf228964e17bf13538b816bb29ba4ad57f4854158f04890"}
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.421836 4749 scope.go:117] "RemoveContainer" containerID="0e8142b0ec4100ee1982091b273bacdf9b7d282d43fc83d6cb6d1ee723a3d24f"
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.445730 4749 scope.go:117] "RemoveContainer" containerID="575f63a9680a763a2e40c4ac3ad1d6717f511109b3f1042de55993142264c47d"
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.478607 4749 scope.go:117] "RemoveContainer" containerID="49c77b9e55d41ee88efe5b91e6bebbb8a70ea6c5657a6a17320fed708feeb48f"
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.507193 4749 scope.go:117] "RemoveContainer" containerID="0e8142b0ec4100ee1982091b273bacdf9b7d282d43fc83d6cb6d1ee723a3d24f"
Mar 10 16:30:21 crc kubenswrapper[4749]: E0310 16:30:21.507922 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e8142b0ec4100ee1982091b273bacdf9b7d282d43fc83d6cb6d1ee723a3d24f\": container with ID starting with 0e8142b0ec4100ee1982091b273bacdf9b7d282d43fc83d6cb6d1ee723a3d24f not found: ID does not exist" containerID="0e8142b0ec4100ee1982091b273bacdf9b7d282d43fc83d6cb6d1ee723a3d24f"
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.507966 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e8142b0ec4100ee1982091b273bacdf9b7d282d43fc83d6cb6d1ee723a3d24f"} err="failed to get container status \"0e8142b0ec4100ee1982091b273bacdf9b7d282d43fc83d6cb6d1ee723a3d24f\": rpc error: code = NotFound desc = could not find container \"0e8142b0ec4100ee1982091b273bacdf9b7d282d43fc83d6cb6d1ee723a3d24f\": container with ID starting with 0e8142b0ec4100ee1982091b273bacdf9b7d282d43fc83d6cb6d1ee723a3d24f not found: ID does not exist"
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.507996 4749 scope.go:117] "RemoveContainer" containerID="575f63a9680a763a2e40c4ac3ad1d6717f511109b3f1042de55993142264c47d"
Mar 10 16:30:21 crc kubenswrapper[4749]: E0310 16:30:21.508507 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"575f63a9680a763a2e40c4ac3ad1d6717f511109b3f1042de55993142264c47d\": container with ID starting with 575f63a9680a763a2e40c4ac3ad1d6717f511109b3f1042de55993142264c47d not found: ID does not exist" containerID="575f63a9680a763a2e40c4ac3ad1d6717f511109b3f1042de55993142264c47d"
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.508549 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"575f63a9680a763a2e40c4ac3ad1d6717f511109b3f1042de55993142264c47d"} err="failed to get container status \"575f63a9680a763a2e40c4ac3ad1d6717f511109b3f1042de55993142264c47d\": rpc error: code = NotFound desc = could not find container \"575f63a9680a763a2e40c4ac3ad1d6717f511109b3f1042de55993142264c47d\": container with ID starting with 575f63a9680a763a2e40c4ac3ad1d6717f511109b3f1042de55993142264c47d not found: ID does not exist"
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.508576 4749 scope.go:117] "RemoveContainer" containerID="49c77b9e55d41ee88efe5b91e6bebbb8a70ea6c5657a6a17320fed708feeb48f"
Mar 10 16:30:21 crc kubenswrapper[4749]: E0310 16:30:21.509064 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49c77b9e55d41ee88efe5b91e6bebbb8a70ea6c5657a6a17320fed708feeb48f\": container with ID starting with 49c77b9e55d41ee88efe5b91e6bebbb8a70ea6c5657a6a17320fed708feeb48f not found: ID does not exist" containerID="49c77b9e55d41ee88efe5b91e6bebbb8a70ea6c5657a6a17320fed708feeb48f"
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.509125 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c77b9e55d41ee88efe5b91e6bebbb8a70ea6c5657a6a17320fed708feeb48f"} err="failed to get container status \"49c77b9e55d41ee88efe5b91e6bebbb8a70ea6c5657a6a17320fed708feeb48f\": rpc error: code = NotFound desc = could not find container \"49c77b9e55d41ee88efe5b91e6bebbb8a70ea6c5657a6a17320fed708feeb48f\": container with ID starting with 49c77b9e55d41ee88efe5b91e6bebbb8a70ea6c5657a6a17320fed708feeb48f not found: ID does not exist"
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.595914 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" (UID: "85310e33-bf20-4dd1-a8bc-1e8a28ad2da8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.660040 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.745679 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6wzkh"]
Mar 10 16:30:21 crc kubenswrapper[4749]: I0310 16:30:21.752877 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6wzkh"]
Mar 10 16:30:23 crc kubenswrapper[4749]: I0310 16:30:23.618892 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" path="/var/lib/kubelet/pods/85310e33-bf20-4dd1-a8bc-1e8a28ad2da8/volumes"
Mar 10 16:30:24 crc kubenswrapper[4749]: I0310 16:30:24.523952 4749 scope.go:117] "RemoveContainer" containerID="f5de8894560d11e1ad6786a4a95ca31c29d7a02cf4b613ae120a657fe64ab518"
Mar 10 16:30:24 crc kubenswrapper[4749]: I0310 16:30:24.568871 4749 scope.go:117] "RemoveContainer" containerID="fca82aad26fe6fb51e6c3d005288e1c320c91f025525220def58af6067ac90f2"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.295088 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q5cvw"]
Mar 10 16:30:26 crc kubenswrapper[4749]: E0310 16:30:26.295886 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" containerName="extract-utilities"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.295907 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" containerName="extract-utilities"
Mar 10 16:30:26 crc kubenswrapper[4749]: E0310 16:30:26.295928 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" containerName="registry-server"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.295937 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" containerName="registry-server"
Mar 10 16:30:26 crc kubenswrapper[4749]: E0310 16:30:26.295970 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" containerName="extract-content"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.295984 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" containerName="extract-content"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.296180 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85310e33-bf20-4dd1-a8bc-1e8a28ad2da8" containerName="registry-server"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.297799 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.307880 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q5cvw"]
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.429911 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/904b24b3-e0d3-452a-855c-3cfc2f78a152-catalog-content\") pod \"redhat-operators-q5cvw\" (UID: \"904b24b3-e0d3-452a-855c-3cfc2f78a152\") " pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.429969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r9zz\" (UniqueName: \"kubernetes.io/projected/904b24b3-e0d3-452a-855c-3cfc2f78a152-kube-api-access-8r9zz\") pod \"redhat-operators-q5cvw\" (UID: \"904b24b3-e0d3-452a-855c-3cfc2f78a152\") " pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.430018 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/904b24b3-e0d3-452a-855c-3cfc2f78a152-utilities\") pod \"redhat-operators-q5cvw\" (UID: \"904b24b3-e0d3-452a-855c-3cfc2f78a152\") " pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.531437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/904b24b3-e0d3-452a-855c-3cfc2f78a152-catalog-content\") pod \"redhat-operators-q5cvw\" (UID: \"904b24b3-e0d3-452a-855c-3cfc2f78a152\") " pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.531487 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r9zz\" (UniqueName: \"kubernetes.io/projected/904b24b3-e0d3-452a-855c-3cfc2f78a152-kube-api-access-8r9zz\") pod \"redhat-operators-q5cvw\" (UID: \"904b24b3-e0d3-452a-855c-3cfc2f78a152\") " pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.531537 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/904b24b3-e0d3-452a-855c-3cfc2f78a152-utilities\") pod \"redhat-operators-q5cvw\" (UID: \"904b24b3-e0d3-452a-855c-3cfc2f78a152\") " pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.532013 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/904b24b3-e0d3-452a-855c-3cfc2f78a152-utilities\") pod \"redhat-operators-q5cvw\" (UID: \"904b24b3-e0d3-452a-855c-3cfc2f78a152\") " pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.532265 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/904b24b3-e0d3-452a-855c-3cfc2f78a152-catalog-content\") pod \"redhat-operators-q5cvw\" (UID: \"904b24b3-e0d3-452a-855c-3cfc2f78a152\") " pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.558522 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r9zz\" (UniqueName: \"kubernetes.io/projected/904b24b3-e0d3-452a-855c-3cfc2f78a152-kube-api-access-8r9zz\") pod \"redhat-operators-q5cvw\" (UID: \"904b24b3-e0d3-452a-855c-3cfc2f78a152\") " pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:26 crc kubenswrapper[4749]: I0310 16:30:26.620669 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:27 crc kubenswrapper[4749]: I0310 16:30:27.062800 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q5cvw"]
Mar 10 16:30:27 crc kubenswrapper[4749]: I0310 16:30:27.474019 4749 generic.go:334] "Generic (PLEG): container finished" podID="904b24b3-e0d3-452a-855c-3cfc2f78a152" containerID="eead918b0c52354a9bdd0f6e1aea1e10631e5d6751e4c31d3e7783b00b26126e" exitCode=0
Mar 10 16:30:27 crc kubenswrapper[4749]: I0310 16:30:27.474105 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5cvw" event={"ID":"904b24b3-e0d3-452a-855c-3cfc2f78a152","Type":"ContainerDied","Data":"eead918b0c52354a9bdd0f6e1aea1e10631e5d6751e4c31d3e7783b00b26126e"}
Mar 10 16:30:27 crc kubenswrapper[4749]: I0310 16:30:27.475245 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5cvw" event={"ID":"904b24b3-e0d3-452a-855c-3cfc2f78a152","Type":"ContainerStarted","Data":"aada8b15bffba48008e8e58b6cb48731a3c83bb599c9a22c425f1363e58b2845"}
Mar 10 16:30:31 crc kubenswrapper[4749]: I0310 16:30:31.611882 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a"
Mar 10 16:30:31 crc kubenswrapper[4749]: E0310 16:30:31.612547 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 16:30:36 crc kubenswrapper[4749]: I0310 16:30:36.539620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5cvw" event={"ID":"904b24b3-e0d3-452a-855c-3cfc2f78a152","Type":"ContainerStarted","Data":"f988c6e1708cd842cf39a44fb18a84dfa26f998b9228002fd4d83112ae2803d3"}
Mar 10 16:30:37 crc kubenswrapper[4749]: I0310 16:30:37.548123 4749 generic.go:334] "Generic (PLEG): container finished" podID="904b24b3-e0d3-452a-855c-3cfc2f78a152" containerID="f988c6e1708cd842cf39a44fb18a84dfa26f998b9228002fd4d83112ae2803d3" exitCode=0
Mar 10 16:30:37 crc kubenswrapper[4749]: I0310 16:30:37.548177 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5cvw" event={"ID":"904b24b3-e0d3-452a-855c-3cfc2f78a152","Type":"ContainerDied","Data":"f988c6e1708cd842cf39a44fb18a84dfa26f998b9228002fd4d83112ae2803d3"}
Mar 10 16:30:38 crc kubenswrapper[4749]: I0310 16:30:38.557670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5cvw" event={"ID":"904b24b3-e0d3-452a-855c-3cfc2f78a152","Type":"ContainerStarted","Data":"7111f9a04cb360503aa14adb4b460e3c8c75c67b6a681a6e43bfa174016cc567"}
Mar 10 16:30:38 crc kubenswrapper[4749]: I0310 16:30:38.578347 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q5cvw" podStartSLOduration=1.79925343 podStartE2EDuration="12.578329268s" podCreationTimestamp="2026-03-10 16:30:26 +0000 UTC" firstStartedPulling="2026-03-10 16:30:27.475681036 +0000 UTC m=+2524.597546723" lastFinishedPulling="2026-03-10 16:30:38.254756864 +0000 UTC m=+2535.376622561" observedRunningTime="2026-03-10 16:30:38.574775151 +0000 UTC m=+2535.696640838" watchObservedRunningTime="2026-03-10 16:30:38.578329268 +0000 UTC m=+2535.700194945"
Mar 10 16:30:46 crc kubenswrapper[4749]: I0310 16:30:46.606751 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a"
Mar 10 16:30:46 crc kubenswrapper[4749]: E0310 16:30:46.608086 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 16:30:46 crc kubenswrapper[4749]: I0310 16:30:46.620877 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:46 crc kubenswrapper[4749]: I0310 16:30:46.620925 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:46 crc kubenswrapper[4749]: I0310 16:30:46.687649 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:47 crc kubenswrapper[4749]: I0310 16:30:47.677809 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q5cvw"
Mar 10 16:30:47 crc kubenswrapper[4749]: I0310 16:30:47.761301 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q5cvw"]
Mar 10 16:30:47 crc kubenswrapper[4749]: I0310 16:30:47.797264 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xvr84"]
Mar 10 16:30:47 crc kubenswrapper[4749]: I0310 16:30:47.797541 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xvr84" podUID="8294b189-cc7b-45fa-a350-d0fe5bd015ee" containerName="registry-server" containerID="cri-o://f0f59ae3230b9784583d124f922c23640d2fe5f0820db0c175ea8c0ad97e738f" gracePeriod=2
Mar 10 16:30:48 crc kubenswrapper[4749]: I0310 16:30:48.638948 4749 generic.go:334] "Generic (PLEG): container finished" podID="8294b189-cc7b-45fa-a350-d0fe5bd015ee" containerID="f0f59ae3230b9784583d124f922c23640d2fe5f0820db0c175ea8c0ad97e738f" exitCode=0
Mar 10 16:30:48 crc kubenswrapper[4749]: I0310 16:30:48.639036 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvr84" event={"ID":"8294b189-cc7b-45fa-a350-d0fe5bd015ee","Type":"ContainerDied","Data":"f0f59ae3230b9784583d124f922c23640d2fe5f0820db0c175ea8c0ad97e738f"}
Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.468350 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xvr84" Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.610717 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8294b189-cc7b-45fa-a350-d0fe5bd015ee-utilities\") pod \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\" (UID: \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\") " Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.610767 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj77c\" (UniqueName: \"kubernetes.io/projected/8294b189-cc7b-45fa-a350-d0fe5bd015ee-kube-api-access-kj77c\") pod \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\" (UID: \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\") " Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.610829 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8294b189-cc7b-45fa-a350-d0fe5bd015ee-catalog-content\") pod \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\" (UID: \"8294b189-cc7b-45fa-a350-d0fe5bd015ee\") " Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.611341 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8294b189-cc7b-45fa-a350-d0fe5bd015ee-utilities" (OuterVolumeSpecName: "utilities") pod "8294b189-cc7b-45fa-a350-d0fe5bd015ee" (UID: "8294b189-cc7b-45fa-a350-d0fe5bd015ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.617004 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8294b189-cc7b-45fa-a350-d0fe5bd015ee-kube-api-access-kj77c" (OuterVolumeSpecName: "kube-api-access-kj77c") pod "8294b189-cc7b-45fa-a350-d0fe5bd015ee" (UID: "8294b189-cc7b-45fa-a350-d0fe5bd015ee"). InnerVolumeSpecName "kube-api-access-kj77c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.657338 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvr84" event={"ID":"8294b189-cc7b-45fa-a350-d0fe5bd015ee","Type":"ContainerDied","Data":"923e63d3e5532bd8d24656a0f4d5dcb053869ed76171708c38bfbe67af46cbd0"} Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.657401 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xvr84" Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.657415 4749 scope.go:117] "RemoveContainer" containerID="f0f59ae3230b9784583d124f922c23640d2fe5f0820db0c175ea8c0ad97e738f" Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.675558 4749 scope.go:117] "RemoveContainer" containerID="a0966eedb53ad9e89bc2e8ef010d6b1f0bcfeb04f1d0bc2771c7f5964851a535" Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.694509 4749 scope.go:117] "RemoveContainer" containerID="8f55318d965c0610ff6024a3687cc64d10ea059e8e1e84fc652960e753ef5434" Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.713062 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8294b189-cc7b-45fa-a350-d0fe5bd015ee-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.713100 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj77c\" (UniqueName: \"kubernetes.io/projected/8294b189-cc7b-45fa-a350-d0fe5bd015ee-kube-api-access-kj77c\") on node \"crc\" DevicePath \"\"" Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.716684 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8294b189-cc7b-45fa-a350-d0fe5bd015ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8294b189-cc7b-45fa-a350-d0fe5bd015ee" (UID: "8294b189-cc7b-45fa-a350-d0fe5bd015ee"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.815065 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8294b189-cc7b-45fa-a350-d0fe5bd015ee-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:30:51 crc kubenswrapper[4749]: I0310 16:30:51.995116 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xvr84"] Mar 10 16:30:52 crc kubenswrapper[4749]: I0310 16:30:52.006323 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xvr84"] Mar 10 16:30:53 crc kubenswrapper[4749]: I0310 16:30:53.625123 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8294b189-cc7b-45fa-a350-d0fe5bd015ee" path="/var/lib/kubelet/pods/8294b189-cc7b-45fa-a350-d0fe5bd015ee/volumes" Mar 10 16:30:58 crc kubenswrapper[4749]: I0310 16:30:58.608288 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:30:58 crc kubenswrapper[4749]: E0310 16:30:58.609500 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:31:09 crc kubenswrapper[4749]: I0310 16:31:09.607827 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:31:09 crc kubenswrapper[4749]: E0310 16:31:09.609755 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:31:21 crc kubenswrapper[4749]: I0310 16:31:21.606460 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:31:21 crc kubenswrapper[4749]: E0310 16:31:21.606959 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:31:36 crc kubenswrapper[4749]: I0310 16:31:36.607030 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:31:36 crc kubenswrapper[4749]: E0310 16:31:36.608952 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:31:49 crc kubenswrapper[4749]: I0310 16:31:49.606819 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:31:49 crc kubenswrapper[4749]: E0310 16:31:49.607974 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.143998 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552672-lnz9l"] Mar 10 16:32:00 crc kubenswrapper[4749]: E0310 16:32:00.144857 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8294b189-cc7b-45fa-a350-d0fe5bd015ee" containerName="extract-content" Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.144872 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8294b189-cc7b-45fa-a350-d0fe5bd015ee" containerName="extract-content" Mar 10 16:32:00 crc kubenswrapper[4749]: E0310 16:32:00.144892 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8294b189-cc7b-45fa-a350-d0fe5bd015ee" containerName="registry-server" Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.144899 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8294b189-cc7b-45fa-a350-d0fe5bd015ee" containerName="registry-server" Mar 10 16:32:00 crc kubenswrapper[4749]: E0310 16:32:00.144909 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8294b189-cc7b-45fa-a350-d0fe5bd015ee" containerName="extract-utilities" Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.144918 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8294b189-cc7b-45fa-a350-d0fe5bd015ee" containerName="extract-utilities" Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.145075 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8294b189-cc7b-45fa-a350-d0fe5bd015ee" containerName="registry-server" Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.145600 4749 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552672-lnz9l" Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.148541 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.148941 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.150653 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.159695 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552672-lnz9l"] Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.254128 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbttf\" (UniqueName: \"kubernetes.io/projected/c2078353-2d1d-42fa-ba1b-30613a1a4007-kube-api-access-wbttf\") pod \"auto-csr-approver-29552672-lnz9l\" (UID: \"c2078353-2d1d-42fa-ba1b-30613a1a4007\") " pod="openshift-infra/auto-csr-approver-29552672-lnz9l" Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.356450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbttf\" (UniqueName: \"kubernetes.io/projected/c2078353-2d1d-42fa-ba1b-30613a1a4007-kube-api-access-wbttf\") pod \"auto-csr-approver-29552672-lnz9l\" (UID: \"c2078353-2d1d-42fa-ba1b-30613a1a4007\") " pod="openshift-infra/auto-csr-approver-29552672-lnz9l" Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.383887 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbttf\" (UniqueName: \"kubernetes.io/projected/c2078353-2d1d-42fa-ba1b-30613a1a4007-kube-api-access-wbttf\") pod \"auto-csr-approver-29552672-lnz9l\" (UID: 
\"c2078353-2d1d-42fa-ba1b-30613a1a4007\") " pod="openshift-infra/auto-csr-approver-29552672-lnz9l" Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.465763 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552672-lnz9l" Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.923491 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552672-lnz9l"] Mar 10 16:32:00 crc kubenswrapper[4749]: I0310 16:32:00.928711 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:32:01 crc kubenswrapper[4749]: I0310 16:32:01.208348 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552672-lnz9l" event={"ID":"c2078353-2d1d-42fa-ba1b-30613a1a4007","Type":"ContainerStarted","Data":"b6d664d1a1918037fe355e66c234e5d68edef5827101cecac62e5d977a45b59b"} Mar 10 16:32:03 crc kubenswrapper[4749]: I0310 16:32:03.226589 4749 generic.go:334] "Generic (PLEG): container finished" podID="c2078353-2d1d-42fa-ba1b-30613a1a4007" containerID="e0956bdfa5a4dae1084c827e13b622fbf4334b0ae3e06ef0b3b842fb6b079f7d" exitCode=0 Mar 10 16:32:03 crc kubenswrapper[4749]: I0310 16:32:03.226700 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552672-lnz9l" event={"ID":"c2078353-2d1d-42fa-ba1b-30613a1a4007","Type":"ContainerDied","Data":"e0956bdfa5a4dae1084c827e13b622fbf4334b0ae3e06ef0b3b842fb6b079f7d"} Mar 10 16:32:04 crc kubenswrapper[4749]: I0310 16:32:04.529433 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552672-lnz9l" Mar 10 16:32:04 crc kubenswrapper[4749]: I0310 16:32:04.607441 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:32:04 crc kubenswrapper[4749]: E0310 16:32:04.607730 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:32:04 crc kubenswrapper[4749]: I0310 16:32:04.717613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbttf\" (UniqueName: \"kubernetes.io/projected/c2078353-2d1d-42fa-ba1b-30613a1a4007-kube-api-access-wbttf\") pod \"c2078353-2d1d-42fa-ba1b-30613a1a4007\" (UID: \"c2078353-2d1d-42fa-ba1b-30613a1a4007\") " Mar 10 16:32:04 crc kubenswrapper[4749]: I0310 16:32:04.723684 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2078353-2d1d-42fa-ba1b-30613a1a4007-kube-api-access-wbttf" (OuterVolumeSpecName: "kube-api-access-wbttf") pod "c2078353-2d1d-42fa-ba1b-30613a1a4007" (UID: "c2078353-2d1d-42fa-ba1b-30613a1a4007"). InnerVolumeSpecName "kube-api-access-wbttf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:32:04 crc kubenswrapper[4749]: I0310 16:32:04.819590 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbttf\" (UniqueName: \"kubernetes.io/projected/c2078353-2d1d-42fa-ba1b-30613a1a4007-kube-api-access-wbttf\") on node \"crc\" DevicePath \"\"" Mar 10 16:32:05 crc kubenswrapper[4749]: I0310 16:32:05.252134 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552672-lnz9l" event={"ID":"c2078353-2d1d-42fa-ba1b-30613a1a4007","Type":"ContainerDied","Data":"b6d664d1a1918037fe355e66c234e5d68edef5827101cecac62e5d977a45b59b"} Mar 10 16:32:05 crc kubenswrapper[4749]: I0310 16:32:05.252183 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6d664d1a1918037fe355e66c234e5d68edef5827101cecac62e5d977a45b59b" Mar 10 16:32:05 crc kubenswrapper[4749]: I0310 16:32:05.252239 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552672-lnz9l" Mar 10 16:32:05 crc kubenswrapper[4749]: I0310 16:32:05.601527 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552666-skz2g"] Mar 10 16:32:05 crc kubenswrapper[4749]: I0310 16:32:05.617914 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552666-skz2g"] Mar 10 16:32:07 crc kubenswrapper[4749]: I0310 16:32:07.618408 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c6d767a-41c8-4cbf-a205-e89b2fadd947" path="/var/lib/kubelet/pods/2c6d767a-41c8-4cbf-a205-e89b2fadd947/volumes" Mar 10 16:32:15 crc kubenswrapper[4749]: I0310 16:32:15.607767 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:32:15 crc kubenswrapper[4749]: E0310 16:32:15.608874 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:32:23 crc kubenswrapper[4749]: I0310 16:32:23.816683 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7g728"] Mar 10 16:32:23 crc kubenswrapper[4749]: E0310 16:32:23.818820 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2078353-2d1d-42fa-ba1b-30613a1a4007" containerName="oc" Mar 10 16:32:23 crc kubenswrapper[4749]: I0310 16:32:23.818860 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2078353-2d1d-42fa-ba1b-30613a1a4007" containerName="oc" Mar 10 16:32:23 crc kubenswrapper[4749]: I0310 16:32:23.819011 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2078353-2d1d-42fa-ba1b-30613a1a4007" containerName="oc" Mar 10 16:32:23 crc kubenswrapper[4749]: I0310 16:32:23.820023 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:23 crc kubenswrapper[4749]: I0310 16:32:23.829830 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7g728"] Mar 10 16:32:23 crc kubenswrapper[4749]: I0310 16:32:23.862629 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-catalog-content\") pod \"certified-operators-7g728\" (UID: \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\") " pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:23 crc kubenswrapper[4749]: I0310 16:32:23.862673 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlppx\" (UniqueName: \"kubernetes.io/projected/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-kube-api-access-jlppx\") pod \"certified-operators-7g728\" (UID: \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\") " pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:23 crc kubenswrapper[4749]: I0310 16:32:23.862761 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-utilities\") pod \"certified-operators-7g728\" (UID: \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\") " pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:23 crc kubenswrapper[4749]: I0310 16:32:23.964285 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-utilities\") pod \"certified-operators-7g728\" (UID: \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\") " pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:23 crc kubenswrapper[4749]: I0310 16:32:23.964658 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-catalog-content\") pod \"certified-operators-7g728\" (UID: \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\") " pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:23 crc kubenswrapper[4749]: I0310 16:32:23.964793 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlppx\" (UniqueName: \"kubernetes.io/projected/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-kube-api-access-jlppx\") pod \"certified-operators-7g728\" (UID: \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\") " pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:23 crc kubenswrapper[4749]: I0310 16:32:23.964841 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-utilities\") pod \"certified-operators-7g728\" (UID: \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\") " pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:23 crc kubenswrapper[4749]: I0310 16:32:23.965134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-catalog-content\") pod \"certified-operators-7g728\" (UID: \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\") " pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:23 crc kubenswrapper[4749]: I0310 16:32:23.991281 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlppx\" (UniqueName: \"kubernetes.io/projected/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-kube-api-access-jlppx\") pod \"certified-operators-7g728\" (UID: \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\") " pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:24 crc kubenswrapper[4749]: I0310 16:32:24.136537 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:24 crc kubenswrapper[4749]: I0310 16:32:24.631458 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7g728"] Mar 10 16:32:24 crc kubenswrapper[4749]: I0310 16:32:24.690915 4749 scope.go:117] "RemoveContainer" containerID="3d2f5b5a5f5c50d3e7437c02bab723d3c5cf8e4338e344594f00b1577c0b0121" Mar 10 16:32:25 crc kubenswrapper[4749]: I0310 16:32:25.416283 4749 generic.go:334] "Generic (PLEG): container finished" podID="b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" containerID="c14e535367255fd9a363f73caffbce888c51019a9eeeb6ae72b16166e5573c6f" exitCode=0 Mar 10 16:32:25 crc kubenswrapper[4749]: I0310 16:32:25.416334 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7g728" event={"ID":"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a","Type":"ContainerDied","Data":"c14e535367255fd9a363f73caffbce888c51019a9eeeb6ae72b16166e5573c6f"} Mar 10 16:32:25 crc kubenswrapper[4749]: I0310 16:32:25.416652 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7g728" event={"ID":"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a","Type":"ContainerStarted","Data":"f109bd4b35e9a7c99cd78b966e19766bf6b6f9540917710d08e2ac07918b310e"} Mar 10 16:32:27 crc kubenswrapper[4749]: I0310 16:32:27.431895 4749 generic.go:334] "Generic (PLEG): container finished" podID="b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" containerID="4a2e10124056a8ce3e6240492d315113c87b90eeea9822ffeb68b2788aa3df24" exitCode=0 Mar 10 16:32:27 crc kubenswrapper[4749]: I0310 16:32:27.431980 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7g728" event={"ID":"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a","Type":"ContainerDied","Data":"4a2e10124056a8ce3e6240492d315113c87b90eeea9822ffeb68b2788aa3df24"} Mar 10 16:32:28 crc kubenswrapper[4749]: I0310 16:32:28.442564 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7g728" event={"ID":"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a","Type":"ContainerStarted","Data":"b615fdb0c44ecb0af6d20c1b2f2ee27b78bf3439bf1ad95959bd9d1728afe590"} Mar 10 16:32:28 crc kubenswrapper[4749]: I0310 16:32:28.462631 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7g728" podStartSLOduration=2.901565826 podStartE2EDuration="5.462610411s" podCreationTimestamp="2026-03-10 16:32:23 +0000 UTC" firstStartedPulling="2026-03-10 16:32:25.419576707 +0000 UTC m=+2642.541442394" lastFinishedPulling="2026-03-10 16:32:27.980621292 +0000 UTC m=+2645.102486979" observedRunningTime="2026-03-10 16:32:28.458222812 +0000 UTC m=+2645.580088509" watchObservedRunningTime="2026-03-10 16:32:28.462610411 +0000 UTC m=+2645.584476098" Mar 10 16:32:29 crc kubenswrapper[4749]: I0310 16:32:29.607840 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:32:29 crc kubenswrapper[4749]: E0310 16:32:29.608245 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:32:34 crc kubenswrapper[4749]: I0310 16:32:34.137418 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:34 crc kubenswrapper[4749]: I0310 16:32:34.138005 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:34 crc kubenswrapper[4749]: I0310 16:32:34.199189 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:34 crc kubenswrapper[4749]: I0310 16:32:34.527288 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:34 crc kubenswrapper[4749]: I0310 16:32:34.807290 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7g728"] Mar 10 16:32:36 crc kubenswrapper[4749]: I0310 16:32:36.500408 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7g728" podUID="b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" containerName="registry-server" containerID="cri-o://b615fdb0c44ecb0af6d20c1b2f2ee27b78bf3439bf1ad95959bd9d1728afe590" gracePeriod=2 Mar 10 16:32:36 crc kubenswrapper[4749]: I0310 16:32:36.883484 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:36 crc kubenswrapper[4749]: I0310 16:32:36.951020 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-utilities\") pod \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\" (UID: \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\") " Mar 10 16:32:36 crc kubenswrapper[4749]: I0310 16:32:36.951100 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlppx\" (UniqueName: \"kubernetes.io/projected/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-kube-api-access-jlppx\") pod \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\" (UID: \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\") " Mar 10 16:32:36 crc kubenswrapper[4749]: I0310 16:32:36.951183 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-catalog-content\") pod \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\" (UID: \"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a\") " Mar 10 16:32:36 crc kubenswrapper[4749]: I0310 16:32:36.952246 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-utilities" (OuterVolumeSpecName: "utilities") pod "b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" (UID: "b55e8a46-89f4-4e12-b5a5-1a00eeecde6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:32:36 crc kubenswrapper[4749]: I0310 16:32:36.958001 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-kube-api-access-jlppx" (OuterVolumeSpecName: "kube-api-access-jlppx") pod "b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" (UID: "b55e8a46-89f4-4e12-b5a5-1a00eeecde6a"). InnerVolumeSpecName "kube-api-access-jlppx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.019111 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" (UID: "b55e8a46-89f4-4e12-b5a5-1a00eeecde6a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.052985 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.053032 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlppx\" (UniqueName: \"kubernetes.io/projected/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-kube-api-access-jlppx\") on node \"crc\" DevicePath \"\"" Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.053051 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.509338 4749 generic.go:334] "Generic (PLEG): container finished" podID="b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" containerID="b615fdb0c44ecb0af6d20c1b2f2ee27b78bf3439bf1ad95959bd9d1728afe590" exitCode=0 Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.509436 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7g728" event={"ID":"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a","Type":"ContainerDied","Data":"b615fdb0c44ecb0af6d20c1b2f2ee27b78bf3439bf1ad95959bd9d1728afe590"} Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.509483 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7g728" event={"ID":"b55e8a46-89f4-4e12-b5a5-1a00eeecde6a","Type":"ContainerDied","Data":"f109bd4b35e9a7c99cd78b966e19766bf6b6f9540917710d08e2ac07918b310e"} Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.509510 4749 scope.go:117] "RemoveContainer" containerID="b615fdb0c44ecb0af6d20c1b2f2ee27b78bf3439bf1ad95959bd9d1728afe590" Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 
16:32:37.509437 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7g728" Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.528549 4749 scope.go:117] "RemoveContainer" containerID="4a2e10124056a8ce3e6240492d315113c87b90eeea9822ffeb68b2788aa3df24" Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.551927 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7g728"] Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.562468 4749 scope.go:117] "RemoveContainer" containerID="c14e535367255fd9a363f73caffbce888c51019a9eeeb6ae72b16166e5573c6f" Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.564398 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7g728"] Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.586877 4749 scope.go:117] "RemoveContainer" containerID="b615fdb0c44ecb0af6d20c1b2f2ee27b78bf3439bf1ad95959bd9d1728afe590" Mar 10 16:32:37 crc kubenswrapper[4749]: E0310 16:32:37.587226 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b615fdb0c44ecb0af6d20c1b2f2ee27b78bf3439bf1ad95959bd9d1728afe590\": container with ID starting with b615fdb0c44ecb0af6d20c1b2f2ee27b78bf3439bf1ad95959bd9d1728afe590 not found: ID does not exist" containerID="b615fdb0c44ecb0af6d20c1b2f2ee27b78bf3439bf1ad95959bd9d1728afe590" Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.587253 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b615fdb0c44ecb0af6d20c1b2f2ee27b78bf3439bf1ad95959bd9d1728afe590"} err="failed to get container status \"b615fdb0c44ecb0af6d20c1b2f2ee27b78bf3439bf1ad95959bd9d1728afe590\": rpc error: code = NotFound desc = could not find container \"b615fdb0c44ecb0af6d20c1b2f2ee27b78bf3439bf1ad95959bd9d1728afe590\": container with ID starting with 
b615fdb0c44ecb0af6d20c1b2f2ee27b78bf3439bf1ad95959bd9d1728afe590 not found: ID does not exist" Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.587274 4749 scope.go:117] "RemoveContainer" containerID="4a2e10124056a8ce3e6240492d315113c87b90eeea9822ffeb68b2788aa3df24" Mar 10 16:32:37 crc kubenswrapper[4749]: E0310 16:32:37.587462 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a2e10124056a8ce3e6240492d315113c87b90eeea9822ffeb68b2788aa3df24\": container with ID starting with 4a2e10124056a8ce3e6240492d315113c87b90eeea9822ffeb68b2788aa3df24 not found: ID does not exist" containerID="4a2e10124056a8ce3e6240492d315113c87b90eeea9822ffeb68b2788aa3df24" Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.587479 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a2e10124056a8ce3e6240492d315113c87b90eeea9822ffeb68b2788aa3df24"} err="failed to get container status \"4a2e10124056a8ce3e6240492d315113c87b90eeea9822ffeb68b2788aa3df24\": rpc error: code = NotFound desc = could not find container \"4a2e10124056a8ce3e6240492d315113c87b90eeea9822ffeb68b2788aa3df24\": container with ID starting with 4a2e10124056a8ce3e6240492d315113c87b90eeea9822ffeb68b2788aa3df24 not found: ID does not exist" Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.587493 4749 scope.go:117] "RemoveContainer" containerID="c14e535367255fd9a363f73caffbce888c51019a9eeeb6ae72b16166e5573c6f" Mar 10 16:32:37 crc kubenswrapper[4749]: E0310 16:32:37.587636 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c14e535367255fd9a363f73caffbce888c51019a9eeeb6ae72b16166e5573c6f\": container with ID starting with c14e535367255fd9a363f73caffbce888c51019a9eeeb6ae72b16166e5573c6f not found: ID does not exist" containerID="c14e535367255fd9a363f73caffbce888c51019a9eeeb6ae72b16166e5573c6f" Mar 10 16:32:37 crc 
kubenswrapper[4749]: I0310 16:32:37.587651 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c14e535367255fd9a363f73caffbce888c51019a9eeeb6ae72b16166e5573c6f"} err="failed to get container status \"c14e535367255fd9a363f73caffbce888c51019a9eeeb6ae72b16166e5573c6f\": rpc error: code = NotFound desc = could not find container \"c14e535367255fd9a363f73caffbce888c51019a9eeeb6ae72b16166e5573c6f\": container with ID starting with c14e535367255fd9a363f73caffbce888c51019a9eeeb6ae72b16166e5573c6f not found: ID does not exist" Mar 10 16:32:37 crc kubenswrapper[4749]: I0310 16:32:37.615876 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" path="/var/lib/kubelet/pods/b55e8a46-89f4-4e12-b5a5-1a00eeecde6a/volumes" Mar 10 16:32:44 crc kubenswrapper[4749]: I0310 16:32:44.607775 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:32:44 crc kubenswrapper[4749]: E0310 16:32:44.608536 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:32:56 crc kubenswrapper[4749]: I0310 16:32:56.606978 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:32:56 crc kubenswrapper[4749]: E0310 16:32:56.608233 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:33:08 crc kubenswrapper[4749]: I0310 16:33:08.606691 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:33:08 crc kubenswrapper[4749]: E0310 16:33:08.607720 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:33:22 crc kubenswrapper[4749]: I0310 16:33:22.607024 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:33:22 crc kubenswrapper[4749]: E0310 16:33:22.607912 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:33:34 crc kubenswrapper[4749]: I0310 16:33:34.607158 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:33:34 crc kubenswrapper[4749]: E0310 16:33:34.608082 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:33:49 crc kubenswrapper[4749]: I0310 16:33:49.606427 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:33:49 crc kubenswrapper[4749]: E0310 16:33:49.607796 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.155524 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552674-gfkpt"] Mar 10 16:34:00 crc kubenswrapper[4749]: E0310 16:34:00.156175 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" containerName="extract-content" Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.156186 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" containerName="extract-content" Mar 10 16:34:00 crc kubenswrapper[4749]: E0310 16:34:00.156207 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" containerName="extract-utilities" Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.156214 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" containerName="extract-utilities" Mar 10 16:34:00 crc kubenswrapper[4749]: E0310 16:34:00.156224 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" containerName="registry-server" Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.156230 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" containerName="registry-server" Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.156343 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b55e8a46-89f4-4e12-b5a5-1a00eeecde6a" containerName="registry-server" Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.156766 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552674-gfkpt" Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.159175 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.159309 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.159763 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.174664 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552674-gfkpt"] Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.224701 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z6ds\" (UniqueName: \"kubernetes.io/projected/df4925b1-7b8c-470b-96b5-c88613db9212-kube-api-access-6z6ds\") pod \"auto-csr-approver-29552674-gfkpt\" (UID: \"df4925b1-7b8c-470b-96b5-c88613db9212\") " pod="openshift-infra/auto-csr-approver-29552674-gfkpt" Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.325761 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6z6ds\" (UniqueName: \"kubernetes.io/projected/df4925b1-7b8c-470b-96b5-c88613db9212-kube-api-access-6z6ds\") pod \"auto-csr-approver-29552674-gfkpt\" (UID: \"df4925b1-7b8c-470b-96b5-c88613db9212\") " pod="openshift-infra/auto-csr-approver-29552674-gfkpt" Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.354068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z6ds\" (UniqueName: \"kubernetes.io/projected/df4925b1-7b8c-470b-96b5-c88613db9212-kube-api-access-6z6ds\") pod \"auto-csr-approver-29552674-gfkpt\" (UID: \"df4925b1-7b8c-470b-96b5-c88613db9212\") " pod="openshift-infra/auto-csr-approver-29552674-gfkpt" Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.484487 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552674-gfkpt" Mar 10 16:34:00 crc kubenswrapper[4749]: I0310 16:34:00.734004 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552674-gfkpt"] Mar 10 16:34:01 crc kubenswrapper[4749]: I0310 16:34:01.219350 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552674-gfkpt" event={"ID":"df4925b1-7b8c-470b-96b5-c88613db9212","Type":"ContainerStarted","Data":"a814c1bfdb6c7a1d59de69b146a39a1d9bc04aef556e2e96f68c1c6ec4e6ad9c"} Mar 10 16:34:02 crc kubenswrapper[4749]: I0310 16:34:02.226980 4749 generic.go:334] "Generic (PLEG): container finished" podID="df4925b1-7b8c-470b-96b5-c88613db9212" containerID="fad3ba586d6293eb06ff5de002ea67771ca69436c648f0ca2fc617c4c71fe652" exitCode=0 Mar 10 16:34:02 crc kubenswrapper[4749]: I0310 16:34:02.227020 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552674-gfkpt" event={"ID":"df4925b1-7b8c-470b-96b5-c88613db9212","Type":"ContainerDied","Data":"fad3ba586d6293eb06ff5de002ea67771ca69436c648f0ca2fc617c4c71fe652"} Mar 10 16:34:02 crc kubenswrapper[4749]: I0310 
16:34:02.606441 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:34:02 crc kubenswrapper[4749]: E0310 16:34:02.606956 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:34:03 crc kubenswrapper[4749]: I0310 16:34:03.522157 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552674-gfkpt" Mar 10 16:34:03 crc kubenswrapper[4749]: I0310 16:34:03.601793 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z6ds\" (UniqueName: \"kubernetes.io/projected/df4925b1-7b8c-470b-96b5-c88613db9212-kube-api-access-6z6ds\") pod \"df4925b1-7b8c-470b-96b5-c88613db9212\" (UID: \"df4925b1-7b8c-470b-96b5-c88613db9212\") " Mar 10 16:34:03 crc kubenswrapper[4749]: I0310 16:34:03.608205 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df4925b1-7b8c-470b-96b5-c88613db9212-kube-api-access-6z6ds" (OuterVolumeSpecName: "kube-api-access-6z6ds") pod "df4925b1-7b8c-470b-96b5-c88613db9212" (UID: "df4925b1-7b8c-470b-96b5-c88613db9212"). InnerVolumeSpecName "kube-api-access-6z6ds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:34:03 crc kubenswrapper[4749]: I0310 16:34:03.703634 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z6ds\" (UniqueName: \"kubernetes.io/projected/df4925b1-7b8c-470b-96b5-c88613db9212-kube-api-access-6z6ds\") on node \"crc\" DevicePath \"\"" Mar 10 16:34:04 crc kubenswrapper[4749]: I0310 16:34:04.245459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552674-gfkpt" event={"ID":"df4925b1-7b8c-470b-96b5-c88613db9212","Type":"ContainerDied","Data":"a814c1bfdb6c7a1d59de69b146a39a1d9bc04aef556e2e96f68c1c6ec4e6ad9c"} Mar 10 16:34:04 crc kubenswrapper[4749]: I0310 16:34:04.245499 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a814c1bfdb6c7a1d59de69b146a39a1d9bc04aef556e2e96f68c1c6ec4e6ad9c" Mar 10 16:34:04 crc kubenswrapper[4749]: I0310 16:34:04.245503 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552674-gfkpt" Mar 10 16:34:04 crc kubenswrapper[4749]: I0310 16:34:04.600884 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552668-zpf6c"] Mar 10 16:34:04 crc kubenswrapper[4749]: I0310 16:34:04.605905 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552668-zpf6c"] Mar 10 16:34:05 crc kubenswrapper[4749]: I0310 16:34:05.621093 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c34e624f-99ae-4766-81df-991a7b3c882c" path="/var/lib/kubelet/pods/c34e624f-99ae-4766-81df-991a7b3c882c/volumes" Mar 10 16:34:13 crc kubenswrapper[4749]: I0310 16:34:13.610110 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:34:13 crc kubenswrapper[4749]: E0310 16:34:13.610928 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:34:24 crc kubenswrapper[4749]: I0310 16:34:24.606916 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:34:24 crc kubenswrapper[4749]: E0310 16:34:24.607782 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:34:24 crc kubenswrapper[4749]: I0310 16:34:24.792511 4749 scope.go:117] "RemoveContainer" containerID="53dede1c82a95b1e02149e212868372ed8f8d76cef98d688991a55336f53f533" Mar 10 16:34:38 crc kubenswrapper[4749]: I0310 16:34:38.607910 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:34:38 crc kubenswrapper[4749]: E0310 16:34:38.609055 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:34:51 crc kubenswrapper[4749]: I0310 16:34:51.607196 4749 scope.go:117] "RemoveContainer" 
containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:34:52 crc kubenswrapper[4749]: I0310 16:34:52.595572 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"947b4a95f3f3dad04ce41601e5ebbfe62834f04cf5e2c3a475ba5da266ed6956"} Mar 10 16:36:00 crc kubenswrapper[4749]: I0310 16:36:00.154283 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552676-9gtl7"] Mar 10 16:36:00 crc kubenswrapper[4749]: E0310 16:36:00.155174 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4925b1-7b8c-470b-96b5-c88613db9212" containerName="oc" Mar 10 16:36:00 crc kubenswrapper[4749]: I0310 16:36:00.155213 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4925b1-7b8c-470b-96b5-c88613db9212" containerName="oc" Mar 10 16:36:00 crc kubenswrapper[4749]: I0310 16:36:00.155642 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4925b1-7b8c-470b-96b5-c88613db9212" containerName="oc" Mar 10 16:36:00 crc kubenswrapper[4749]: I0310 16:36:00.156566 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552676-9gtl7" Mar 10 16:36:00 crc kubenswrapper[4749]: I0310 16:36:00.163241 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:36:00 crc kubenswrapper[4749]: I0310 16:36:00.164409 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:36:00 crc kubenswrapper[4749]: I0310 16:36:00.164476 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:36:00 crc kubenswrapper[4749]: I0310 16:36:00.168967 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552676-9gtl7"] Mar 10 16:36:00 crc kubenswrapper[4749]: I0310 16:36:00.299273 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fct2\" (UniqueName: \"kubernetes.io/projected/2c3c3915-48bd-43ae-a1da-e74e53b1ec0e-kube-api-access-8fct2\") pod \"auto-csr-approver-29552676-9gtl7\" (UID: \"2c3c3915-48bd-43ae-a1da-e74e53b1ec0e\") " pod="openshift-infra/auto-csr-approver-29552676-9gtl7" Mar 10 16:36:00 crc kubenswrapper[4749]: I0310 16:36:00.401244 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fct2\" (UniqueName: \"kubernetes.io/projected/2c3c3915-48bd-43ae-a1da-e74e53b1ec0e-kube-api-access-8fct2\") pod \"auto-csr-approver-29552676-9gtl7\" (UID: \"2c3c3915-48bd-43ae-a1da-e74e53b1ec0e\") " pod="openshift-infra/auto-csr-approver-29552676-9gtl7" Mar 10 16:36:00 crc kubenswrapper[4749]: I0310 16:36:00.435966 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fct2\" (UniqueName: \"kubernetes.io/projected/2c3c3915-48bd-43ae-a1da-e74e53b1ec0e-kube-api-access-8fct2\") pod \"auto-csr-approver-29552676-9gtl7\" (UID: \"2c3c3915-48bd-43ae-a1da-e74e53b1ec0e\") " 
pod="openshift-infra/auto-csr-approver-29552676-9gtl7" Mar 10 16:36:00 crc kubenswrapper[4749]: I0310 16:36:00.481542 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552676-9gtl7" Mar 10 16:36:00 crc kubenswrapper[4749]: I0310 16:36:00.931165 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552676-9gtl7"] Mar 10 16:36:01 crc kubenswrapper[4749]: I0310 16:36:01.169208 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552676-9gtl7" event={"ID":"2c3c3915-48bd-43ae-a1da-e74e53b1ec0e","Type":"ContainerStarted","Data":"cf1ee946ab5bfdd2c0199ae143846462d2b71974c2a3ade8c05194c61ca51e83"} Mar 10 16:36:03 crc kubenswrapper[4749]: I0310 16:36:03.190581 4749 generic.go:334] "Generic (PLEG): container finished" podID="2c3c3915-48bd-43ae-a1da-e74e53b1ec0e" containerID="daad1e35937d991d7845f0e8f65eaa4c64be90b5526f5fe3b9c916ba9325da12" exitCode=0 Mar 10 16:36:03 crc kubenswrapper[4749]: I0310 16:36:03.190672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552676-9gtl7" event={"ID":"2c3c3915-48bd-43ae-a1da-e74e53b1ec0e","Type":"ContainerDied","Data":"daad1e35937d991d7845f0e8f65eaa4c64be90b5526f5fe3b9c916ba9325da12"} Mar 10 16:36:04 crc kubenswrapper[4749]: I0310 16:36:04.555266 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552676-9gtl7" Mar 10 16:36:04 crc kubenswrapper[4749]: I0310 16:36:04.673917 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fct2\" (UniqueName: \"kubernetes.io/projected/2c3c3915-48bd-43ae-a1da-e74e53b1ec0e-kube-api-access-8fct2\") pod \"2c3c3915-48bd-43ae-a1da-e74e53b1ec0e\" (UID: \"2c3c3915-48bd-43ae-a1da-e74e53b1ec0e\") " Mar 10 16:36:04 crc kubenswrapper[4749]: I0310 16:36:04.682394 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3c3915-48bd-43ae-a1da-e74e53b1ec0e-kube-api-access-8fct2" (OuterVolumeSpecName: "kube-api-access-8fct2") pod "2c3c3915-48bd-43ae-a1da-e74e53b1ec0e" (UID: "2c3c3915-48bd-43ae-a1da-e74e53b1ec0e"). InnerVolumeSpecName "kube-api-access-8fct2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:36:04 crc kubenswrapper[4749]: I0310 16:36:04.775175 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fct2\" (UniqueName: \"kubernetes.io/projected/2c3c3915-48bd-43ae-a1da-e74e53b1ec0e-kube-api-access-8fct2\") on node \"crc\" DevicePath \"\"" Mar 10 16:36:05 crc kubenswrapper[4749]: I0310 16:36:05.208608 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552676-9gtl7" event={"ID":"2c3c3915-48bd-43ae-a1da-e74e53b1ec0e","Type":"ContainerDied","Data":"cf1ee946ab5bfdd2c0199ae143846462d2b71974c2a3ade8c05194c61ca51e83"} Mar 10 16:36:05 crc kubenswrapper[4749]: I0310 16:36:05.208655 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf1ee946ab5bfdd2c0199ae143846462d2b71974c2a3ade8c05194c61ca51e83" Mar 10 16:36:05 crc kubenswrapper[4749]: I0310 16:36:05.208714 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552676-9gtl7" Mar 10 16:36:05 crc kubenswrapper[4749]: I0310 16:36:05.638584 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552670-cspng"] Mar 10 16:36:05 crc kubenswrapper[4749]: I0310 16:36:05.645840 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552670-cspng"] Mar 10 16:36:07 crc kubenswrapper[4749]: I0310 16:36:07.620092 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d862895-f120-423b-b784-89631694662d" path="/var/lib/kubelet/pods/9d862895-f120-423b-b784-89631694662d/volumes" Mar 10 16:36:24 crc kubenswrapper[4749]: I0310 16:36:24.882528 4749 scope.go:117] "RemoveContainer" containerID="8af020c6f3346045f1de6c473bbc9aeb04daeda8eda0c6ea5879e37393db7403" Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.432481 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-plsjq"] Mar 10 16:37:17 crc kubenswrapper[4749]: E0310 16:37:17.433202 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3c3915-48bd-43ae-a1da-e74e53b1ec0e" containerName="oc" Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.433215 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3c3915-48bd-43ae-a1da-e74e53b1ec0e" containerName="oc" Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.433385 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3c3915-48bd-43ae-a1da-e74e53b1ec0e" containerName="oc" Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.434423 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.445737 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-plsjq"] Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.588479 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7a222a-2ab8-449c-8d0c-b45adc03382e-utilities\") pod \"redhat-marketplace-plsjq\" (UID: \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\") " pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.588555 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcmf9\" (UniqueName: \"kubernetes.io/projected/8c7a222a-2ab8-449c-8d0c-b45adc03382e-kube-api-access-kcmf9\") pod \"redhat-marketplace-plsjq\" (UID: \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\") " pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.588612 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7a222a-2ab8-449c-8d0c-b45adc03382e-catalog-content\") pod \"redhat-marketplace-plsjq\" (UID: \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\") " pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.690905 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7a222a-2ab8-449c-8d0c-b45adc03382e-utilities\") pod \"redhat-marketplace-plsjq\" (UID: \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\") " pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.690990 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-kcmf9\" (UniqueName: \"kubernetes.io/projected/8c7a222a-2ab8-449c-8d0c-b45adc03382e-kube-api-access-kcmf9\") pod \"redhat-marketplace-plsjq\" (UID: \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\") " pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.691024 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7a222a-2ab8-449c-8d0c-b45adc03382e-catalog-content\") pod \"redhat-marketplace-plsjq\" (UID: \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\") " pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.691895 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7a222a-2ab8-449c-8d0c-b45adc03382e-utilities\") pod \"redhat-marketplace-plsjq\" (UID: \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\") " pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.692305 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7a222a-2ab8-449c-8d0c-b45adc03382e-catalog-content\") pod \"redhat-marketplace-plsjq\" (UID: \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\") " pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.734248 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcmf9\" (UniqueName: \"kubernetes.io/projected/8c7a222a-2ab8-449c-8d0c-b45adc03382e-kube-api-access-kcmf9\") pod \"redhat-marketplace-plsjq\" (UID: \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\") " pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:17 crc kubenswrapper[4749]: I0310 16:37:17.761913 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:18 crc kubenswrapper[4749]: I0310 16:37:18.624286 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-plsjq"] Mar 10 16:37:19 crc kubenswrapper[4749]: I0310 16:37:19.269926 4749 generic.go:334] "Generic (PLEG): container finished" podID="8c7a222a-2ab8-449c-8d0c-b45adc03382e" containerID="3e07dad468e5e48e3f5d44d4929e80f8538181b5dd9391772fa68ccbcf46a04e" exitCode=0 Mar 10 16:37:19 crc kubenswrapper[4749]: I0310 16:37:19.269963 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plsjq" event={"ID":"8c7a222a-2ab8-449c-8d0c-b45adc03382e","Type":"ContainerDied","Data":"3e07dad468e5e48e3f5d44d4929e80f8538181b5dd9391772fa68ccbcf46a04e"} Mar 10 16:37:19 crc kubenswrapper[4749]: I0310 16:37:19.269986 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plsjq" event={"ID":"8c7a222a-2ab8-449c-8d0c-b45adc03382e","Type":"ContainerStarted","Data":"b2496ee56a7813b06bb3878d57403ad5b0b96cbf691fde877e8a197c114443d0"} Mar 10 16:37:19 crc kubenswrapper[4749]: I0310 16:37:19.272654 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:37:20 crc kubenswrapper[4749]: I0310 16:37:20.278122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plsjq" event={"ID":"8c7a222a-2ab8-449c-8d0c-b45adc03382e","Type":"ContainerStarted","Data":"50a95643adf67125ff6013f880cd322e40898da8c67ba5284ce1d9c1d457ed01"} Mar 10 16:37:20 crc kubenswrapper[4749]: I0310 16:37:20.980501 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 
16:37:20 crc kubenswrapper[4749]: I0310 16:37:20.980580 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:37:21 crc kubenswrapper[4749]: I0310 16:37:21.286619 4749 generic.go:334] "Generic (PLEG): container finished" podID="8c7a222a-2ab8-449c-8d0c-b45adc03382e" containerID="50a95643adf67125ff6013f880cd322e40898da8c67ba5284ce1d9c1d457ed01" exitCode=0 Mar 10 16:37:21 crc kubenswrapper[4749]: I0310 16:37:21.286670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plsjq" event={"ID":"8c7a222a-2ab8-449c-8d0c-b45adc03382e","Type":"ContainerDied","Data":"50a95643adf67125ff6013f880cd322e40898da8c67ba5284ce1d9c1d457ed01"} Mar 10 16:37:22 crc kubenswrapper[4749]: I0310 16:37:22.297043 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plsjq" event={"ID":"8c7a222a-2ab8-449c-8d0c-b45adc03382e","Type":"ContainerStarted","Data":"2f0455a168ed985b82fa8111361125a880fd99d81f3c989ae4eaae9824ae798d"} Mar 10 16:37:22 crc kubenswrapper[4749]: I0310 16:37:22.320795 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-plsjq" podStartSLOduration=2.758376169 podStartE2EDuration="5.320768917s" podCreationTimestamp="2026-03-10 16:37:17 +0000 UTC" firstStartedPulling="2026-03-10 16:37:19.272364766 +0000 UTC m=+2936.394230463" lastFinishedPulling="2026-03-10 16:37:21.834757534 +0000 UTC m=+2938.956623211" observedRunningTime="2026-03-10 16:37:22.31317573 +0000 UTC m=+2939.435041437" watchObservedRunningTime="2026-03-10 16:37:22.320768917 +0000 UTC m=+2939.442634624" Mar 10 16:37:27 crc kubenswrapper[4749]: I0310 16:37:27.763385 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:27 crc kubenswrapper[4749]: I0310 16:37:27.763898 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:27 crc kubenswrapper[4749]: I0310 16:37:27.818753 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:28 crc kubenswrapper[4749]: I0310 16:37:28.388257 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:28 crc kubenswrapper[4749]: I0310 16:37:28.435893 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-plsjq"] Mar 10 16:37:30 crc kubenswrapper[4749]: I0310 16:37:30.352252 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-plsjq" podUID="8c7a222a-2ab8-449c-8d0c-b45adc03382e" containerName="registry-server" containerID="cri-o://2f0455a168ed985b82fa8111361125a880fd99d81f3c989ae4eaae9824ae798d" gracePeriod=2 Mar 10 16:37:30 crc kubenswrapper[4749]: I0310 16:37:30.983435 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.139237 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7a222a-2ab8-449c-8d0c-b45adc03382e-utilities\") pod \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\" (UID: \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\") " Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.139456 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcmf9\" (UniqueName: \"kubernetes.io/projected/8c7a222a-2ab8-449c-8d0c-b45adc03382e-kube-api-access-kcmf9\") pod \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\" (UID: \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\") " Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.139552 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7a222a-2ab8-449c-8d0c-b45adc03382e-catalog-content\") pod \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\" (UID: \"8c7a222a-2ab8-449c-8d0c-b45adc03382e\") " Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.140520 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c7a222a-2ab8-449c-8d0c-b45adc03382e-utilities" (OuterVolumeSpecName: "utilities") pod "8c7a222a-2ab8-449c-8d0c-b45adc03382e" (UID: "8c7a222a-2ab8-449c-8d0c-b45adc03382e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.144783 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7a222a-2ab8-449c-8d0c-b45adc03382e-kube-api-access-kcmf9" (OuterVolumeSpecName: "kube-api-access-kcmf9") pod "8c7a222a-2ab8-449c-8d0c-b45adc03382e" (UID: "8c7a222a-2ab8-449c-8d0c-b45adc03382e"). InnerVolumeSpecName "kube-api-access-kcmf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.166920 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c7a222a-2ab8-449c-8d0c-b45adc03382e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c7a222a-2ab8-449c-8d0c-b45adc03382e" (UID: "8c7a222a-2ab8-449c-8d0c-b45adc03382e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.241523 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7a222a-2ab8-449c-8d0c-b45adc03382e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.241606 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7a222a-2ab8-449c-8d0c-b45adc03382e-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.241634 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcmf9\" (UniqueName: \"kubernetes.io/projected/8c7a222a-2ab8-449c-8d0c-b45adc03382e-kube-api-access-kcmf9\") on node \"crc\" DevicePath \"\"" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.370580 4749 generic.go:334] "Generic (PLEG): container finished" podID="8c7a222a-2ab8-449c-8d0c-b45adc03382e" containerID="2f0455a168ed985b82fa8111361125a880fd99d81f3c989ae4eaae9824ae798d" exitCode=0 Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.370618 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plsjq" event={"ID":"8c7a222a-2ab8-449c-8d0c-b45adc03382e","Type":"ContainerDied","Data":"2f0455a168ed985b82fa8111361125a880fd99d81f3c989ae4eaae9824ae798d"} Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.370641 4749 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plsjq" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.370667 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plsjq" event={"ID":"8c7a222a-2ab8-449c-8d0c-b45adc03382e","Type":"ContainerDied","Data":"b2496ee56a7813b06bb3878d57403ad5b0b96cbf691fde877e8a197c114443d0"} Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.370684 4749 scope.go:117] "RemoveContainer" containerID="2f0455a168ed985b82fa8111361125a880fd99d81f3c989ae4eaae9824ae798d" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.397862 4749 scope.go:117] "RemoveContainer" containerID="50a95643adf67125ff6013f880cd322e40898da8c67ba5284ce1d9c1d457ed01" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.400365 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-plsjq"] Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.406751 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-plsjq"] Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.420551 4749 scope.go:117] "RemoveContainer" containerID="3e07dad468e5e48e3f5d44d4929e80f8538181b5dd9391772fa68ccbcf46a04e" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.433159 4749 scope.go:117] "RemoveContainer" containerID="2f0455a168ed985b82fa8111361125a880fd99d81f3c989ae4eaae9824ae798d" Mar 10 16:37:31 crc kubenswrapper[4749]: E0310 16:37:31.433613 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0455a168ed985b82fa8111361125a880fd99d81f3c989ae4eaae9824ae798d\": container with ID starting with 2f0455a168ed985b82fa8111361125a880fd99d81f3c989ae4eaae9824ae798d not found: ID does not exist" containerID="2f0455a168ed985b82fa8111361125a880fd99d81f3c989ae4eaae9824ae798d" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.433648 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0455a168ed985b82fa8111361125a880fd99d81f3c989ae4eaae9824ae798d"} err="failed to get container status \"2f0455a168ed985b82fa8111361125a880fd99d81f3c989ae4eaae9824ae798d\": rpc error: code = NotFound desc = could not find container \"2f0455a168ed985b82fa8111361125a880fd99d81f3c989ae4eaae9824ae798d\": container with ID starting with 2f0455a168ed985b82fa8111361125a880fd99d81f3c989ae4eaae9824ae798d not found: ID does not exist" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.433671 4749 scope.go:117] "RemoveContainer" containerID="50a95643adf67125ff6013f880cd322e40898da8c67ba5284ce1d9c1d457ed01" Mar 10 16:37:31 crc kubenswrapper[4749]: E0310 16:37:31.434218 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50a95643adf67125ff6013f880cd322e40898da8c67ba5284ce1d9c1d457ed01\": container with ID starting with 50a95643adf67125ff6013f880cd322e40898da8c67ba5284ce1d9c1d457ed01 not found: ID does not exist" containerID="50a95643adf67125ff6013f880cd322e40898da8c67ba5284ce1d9c1d457ed01" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.434246 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50a95643adf67125ff6013f880cd322e40898da8c67ba5284ce1d9c1d457ed01"} err="failed to get container status \"50a95643adf67125ff6013f880cd322e40898da8c67ba5284ce1d9c1d457ed01\": rpc error: code = NotFound desc = could not find container \"50a95643adf67125ff6013f880cd322e40898da8c67ba5284ce1d9c1d457ed01\": container with ID starting with 50a95643adf67125ff6013f880cd322e40898da8c67ba5284ce1d9c1d457ed01 not found: ID does not exist" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.434264 4749 scope.go:117] "RemoveContainer" containerID="3e07dad468e5e48e3f5d44d4929e80f8538181b5dd9391772fa68ccbcf46a04e" Mar 10 16:37:31 crc kubenswrapper[4749]: E0310 
16:37:31.434639 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e07dad468e5e48e3f5d44d4929e80f8538181b5dd9391772fa68ccbcf46a04e\": container with ID starting with 3e07dad468e5e48e3f5d44d4929e80f8538181b5dd9391772fa68ccbcf46a04e not found: ID does not exist" containerID="3e07dad468e5e48e3f5d44d4929e80f8538181b5dd9391772fa68ccbcf46a04e" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.434740 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e07dad468e5e48e3f5d44d4929e80f8538181b5dd9391772fa68ccbcf46a04e"} err="failed to get container status \"3e07dad468e5e48e3f5d44d4929e80f8538181b5dd9391772fa68ccbcf46a04e\": rpc error: code = NotFound desc = could not find container \"3e07dad468e5e48e3f5d44d4929e80f8538181b5dd9391772fa68ccbcf46a04e\": container with ID starting with 3e07dad468e5e48e3f5d44d4929e80f8538181b5dd9391772fa68ccbcf46a04e not found: ID does not exist" Mar 10 16:37:31 crc kubenswrapper[4749]: I0310 16:37:31.615552 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c7a222a-2ab8-449c-8d0c-b45adc03382e" path="/var/lib/kubelet/pods/8c7a222a-2ab8-449c-8d0c-b45adc03382e/volumes" Mar 10 16:37:50 crc kubenswrapper[4749]: I0310 16:37:50.980732 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:37:50 crc kubenswrapper[4749]: I0310 16:37:50.981598 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.147620 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552678-x7mlg"] Mar 10 16:38:00 crc kubenswrapper[4749]: E0310 16:38:00.148490 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7a222a-2ab8-449c-8d0c-b45adc03382e" containerName="extract-content" Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.148507 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7a222a-2ab8-449c-8d0c-b45adc03382e" containerName="extract-content" Mar 10 16:38:00 crc kubenswrapper[4749]: E0310 16:38:00.148517 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7a222a-2ab8-449c-8d0c-b45adc03382e" containerName="extract-utilities" Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.148524 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7a222a-2ab8-449c-8d0c-b45adc03382e" containerName="extract-utilities" Mar 10 16:38:00 crc kubenswrapper[4749]: E0310 16:38:00.148555 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7a222a-2ab8-449c-8d0c-b45adc03382e" containerName="registry-server" Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.148564 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7a222a-2ab8-449c-8d0c-b45adc03382e" containerName="registry-server" Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.148729 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7a222a-2ab8-449c-8d0c-b45adc03382e" containerName="registry-server" Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.149326 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552678-x7mlg" Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.151705 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.151727 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.152200 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.156945 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552678-x7mlg"] Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.266921 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sqs6\" (UniqueName: \"kubernetes.io/projected/392fc373-288c-4653-9ea8-8bd54d5deac2-kube-api-access-2sqs6\") pod \"auto-csr-approver-29552678-x7mlg\" (UID: \"392fc373-288c-4653-9ea8-8bd54d5deac2\") " pod="openshift-infra/auto-csr-approver-29552678-x7mlg" Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.368330 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sqs6\" (UniqueName: \"kubernetes.io/projected/392fc373-288c-4653-9ea8-8bd54d5deac2-kube-api-access-2sqs6\") pod \"auto-csr-approver-29552678-x7mlg\" (UID: \"392fc373-288c-4653-9ea8-8bd54d5deac2\") " pod="openshift-infra/auto-csr-approver-29552678-x7mlg" Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.404332 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sqs6\" (UniqueName: \"kubernetes.io/projected/392fc373-288c-4653-9ea8-8bd54d5deac2-kube-api-access-2sqs6\") pod \"auto-csr-approver-29552678-x7mlg\" (UID: \"392fc373-288c-4653-9ea8-8bd54d5deac2\") " 
pod="openshift-infra/auto-csr-approver-29552678-x7mlg" Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.469539 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552678-x7mlg" Mar 10 16:38:00 crc kubenswrapper[4749]: I0310 16:38:00.924856 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552678-x7mlg"] Mar 10 16:38:00 crc kubenswrapper[4749]: W0310 16:38:00.929493 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod392fc373_288c_4653_9ea8_8bd54d5deac2.slice/crio-7626b7a3eea2ec70e7508d74697ea4046286bd5aeb3fd0ccd16d0525ed2543c5 WatchSource:0}: Error finding container 7626b7a3eea2ec70e7508d74697ea4046286bd5aeb3fd0ccd16d0525ed2543c5: Status 404 returned error can't find the container with id 7626b7a3eea2ec70e7508d74697ea4046286bd5aeb3fd0ccd16d0525ed2543c5 Mar 10 16:38:01 crc kubenswrapper[4749]: I0310 16:38:01.619071 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552678-x7mlg" event={"ID":"392fc373-288c-4653-9ea8-8bd54d5deac2","Type":"ContainerStarted","Data":"7626b7a3eea2ec70e7508d74697ea4046286bd5aeb3fd0ccd16d0525ed2543c5"} Mar 10 16:38:02 crc kubenswrapper[4749]: I0310 16:38:02.630711 4749 generic.go:334] "Generic (PLEG): container finished" podID="392fc373-288c-4653-9ea8-8bd54d5deac2" containerID="9a9eed78cf502edd34f2d4b1faae36ed06f5d98e6ed03e6e4a641f8cc56af325" exitCode=0 Mar 10 16:38:02 crc kubenswrapper[4749]: I0310 16:38:02.630776 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552678-x7mlg" event={"ID":"392fc373-288c-4653-9ea8-8bd54d5deac2","Type":"ContainerDied","Data":"9a9eed78cf502edd34f2d4b1faae36ed06f5d98e6ed03e6e4a641f8cc56af325"} Mar 10 16:38:03 crc kubenswrapper[4749]: I0310 16:38:03.908138 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552678-x7mlg" Mar 10 16:38:04 crc kubenswrapper[4749]: I0310 16:38:04.024439 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sqs6\" (UniqueName: \"kubernetes.io/projected/392fc373-288c-4653-9ea8-8bd54d5deac2-kube-api-access-2sqs6\") pod \"392fc373-288c-4653-9ea8-8bd54d5deac2\" (UID: \"392fc373-288c-4653-9ea8-8bd54d5deac2\") " Mar 10 16:38:04 crc kubenswrapper[4749]: I0310 16:38:04.030457 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392fc373-288c-4653-9ea8-8bd54d5deac2-kube-api-access-2sqs6" (OuterVolumeSpecName: "kube-api-access-2sqs6") pod "392fc373-288c-4653-9ea8-8bd54d5deac2" (UID: "392fc373-288c-4653-9ea8-8bd54d5deac2"). InnerVolumeSpecName "kube-api-access-2sqs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:38:04 crc kubenswrapper[4749]: I0310 16:38:04.125874 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sqs6\" (UniqueName: \"kubernetes.io/projected/392fc373-288c-4653-9ea8-8bd54d5deac2-kube-api-access-2sqs6\") on node \"crc\" DevicePath \"\"" Mar 10 16:38:04 crc kubenswrapper[4749]: I0310 16:38:04.647778 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552678-x7mlg" event={"ID":"392fc373-288c-4653-9ea8-8bd54d5deac2","Type":"ContainerDied","Data":"7626b7a3eea2ec70e7508d74697ea4046286bd5aeb3fd0ccd16d0525ed2543c5"} Mar 10 16:38:04 crc kubenswrapper[4749]: I0310 16:38:04.647879 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7626b7a3eea2ec70e7508d74697ea4046286bd5aeb3fd0ccd16d0525ed2543c5" Mar 10 16:38:04 crc kubenswrapper[4749]: I0310 16:38:04.647817 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552678-x7mlg" Mar 10 16:38:04 crc kubenswrapper[4749]: I0310 16:38:04.987514 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552672-lnz9l"] Mar 10 16:38:05 crc kubenswrapper[4749]: I0310 16:38:05.009326 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552672-lnz9l"] Mar 10 16:38:05 crc kubenswrapper[4749]: I0310 16:38:05.621547 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2078353-2d1d-42fa-ba1b-30613a1a4007" path="/var/lib/kubelet/pods/c2078353-2d1d-42fa-ba1b-30613a1a4007/volumes" Mar 10 16:38:20 crc kubenswrapper[4749]: I0310 16:38:20.980678 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:38:20 crc kubenswrapper[4749]: I0310 16:38:20.981197 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:38:20 crc kubenswrapper[4749]: I0310 16:38:20.981247 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 16:38:20 crc kubenswrapper[4749]: I0310 16:38:20.982076 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"947b4a95f3f3dad04ce41601e5ebbfe62834f04cf5e2c3a475ba5da266ed6956"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:38:20 crc kubenswrapper[4749]: I0310 16:38:20.982146 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://947b4a95f3f3dad04ce41601e5ebbfe62834f04cf5e2c3a475ba5da266ed6956" gracePeriod=600 Mar 10 16:38:21 crc kubenswrapper[4749]: I0310 16:38:21.784586 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="947b4a95f3f3dad04ce41601e5ebbfe62834f04cf5e2c3a475ba5da266ed6956" exitCode=0 Mar 10 16:38:21 crc kubenswrapper[4749]: I0310 16:38:21.784682 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"947b4a95f3f3dad04ce41601e5ebbfe62834f04cf5e2c3a475ba5da266ed6956"} Mar 10 16:38:21 crc kubenswrapper[4749]: I0310 16:38:21.784946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373"} Mar 10 16:38:21 crc kubenswrapper[4749]: I0310 16:38:21.784983 4749 scope.go:117] "RemoveContainer" containerID="90b320febd1475a23632297f086a4babda9e574927576c1ac94312b2f662625a" Mar 10 16:38:24 crc kubenswrapper[4749]: I0310 16:38:24.981791 4749 scope.go:117] "RemoveContainer" containerID="e0956bdfa5a4dae1084c827e13b622fbf4334b0ae3e06ef0b3b842fb6b079f7d" Mar 10 16:40:00 crc kubenswrapper[4749]: I0310 16:40:00.138456 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552680-krnvg"] Mar 10 16:40:00 crc kubenswrapper[4749]: E0310 
16:40:00.139475 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392fc373-288c-4653-9ea8-8bd54d5deac2" containerName="oc" Mar 10 16:40:00 crc kubenswrapper[4749]: I0310 16:40:00.139492 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="392fc373-288c-4653-9ea8-8bd54d5deac2" containerName="oc" Mar 10 16:40:00 crc kubenswrapper[4749]: I0310 16:40:00.139659 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="392fc373-288c-4653-9ea8-8bd54d5deac2" containerName="oc" Mar 10 16:40:00 crc kubenswrapper[4749]: I0310 16:40:00.140211 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552680-krnvg" Mar 10 16:40:00 crc kubenswrapper[4749]: I0310 16:40:00.142433 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:40:00 crc kubenswrapper[4749]: I0310 16:40:00.142792 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:40:00 crc kubenswrapper[4749]: I0310 16:40:00.145875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4wzz\" (UniqueName: \"kubernetes.io/projected/224459f3-ed20-441b-9a6a-1a44f29c02ba-kube-api-access-w4wzz\") pod \"auto-csr-approver-29552680-krnvg\" (UID: \"224459f3-ed20-441b-9a6a-1a44f29c02ba\") " pod="openshift-infra/auto-csr-approver-29552680-krnvg" Mar 10 16:40:00 crc kubenswrapper[4749]: I0310 16:40:00.146460 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552680-krnvg"] Mar 10 16:40:00 crc kubenswrapper[4749]: I0310 16:40:00.148668 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:40:00 crc kubenswrapper[4749]: I0310 16:40:00.246928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-w4wzz\" (UniqueName: \"kubernetes.io/projected/224459f3-ed20-441b-9a6a-1a44f29c02ba-kube-api-access-w4wzz\") pod \"auto-csr-approver-29552680-krnvg\" (UID: \"224459f3-ed20-441b-9a6a-1a44f29c02ba\") " pod="openshift-infra/auto-csr-approver-29552680-krnvg" Mar 10 16:40:00 crc kubenswrapper[4749]: I0310 16:40:00.265711 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4wzz\" (UniqueName: \"kubernetes.io/projected/224459f3-ed20-441b-9a6a-1a44f29c02ba-kube-api-access-w4wzz\") pod \"auto-csr-approver-29552680-krnvg\" (UID: \"224459f3-ed20-441b-9a6a-1a44f29c02ba\") " pod="openshift-infra/auto-csr-approver-29552680-krnvg" Mar 10 16:40:00 crc kubenswrapper[4749]: I0310 16:40:00.466352 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552680-krnvg" Mar 10 16:40:00 crc kubenswrapper[4749]: I0310 16:40:00.925883 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552680-krnvg"] Mar 10 16:40:01 crc kubenswrapper[4749]: I0310 16:40:01.582062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552680-krnvg" event={"ID":"224459f3-ed20-441b-9a6a-1a44f29c02ba","Type":"ContainerStarted","Data":"e7e97414f97a4761a8305c1c7087398d500dd0734656b64bff97cab5e16918ef"} Mar 10 16:40:03 crc kubenswrapper[4749]: I0310 16:40:03.601065 4749 generic.go:334] "Generic (PLEG): container finished" podID="224459f3-ed20-441b-9a6a-1a44f29c02ba" containerID="ba8bc40e74c59d2cc35ba98df4a7c80dcc8e460d0d0262fe67c001484432df17" exitCode=0 Mar 10 16:40:03 crc kubenswrapper[4749]: I0310 16:40:03.601166 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552680-krnvg" event={"ID":"224459f3-ed20-441b-9a6a-1a44f29c02ba","Type":"ContainerDied","Data":"ba8bc40e74c59d2cc35ba98df4a7c80dcc8e460d0d0262fe67c001484432df17"} Mar 10 16:40:04 crc kubenswrapper[4749]: I0310 
16:40:04.917099 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552680-krnvg" Mar 10 16:40:05 crc kubenswrapper[4749]: I0310 16:40:05.016401 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4wzz\" (UniqueName: \"kubernetes.io/projected/224459f3-ed20-441b-9a6a-1a44f29c02ba-kube-api-access-w4wzz\") pod \"224459f3-ed20-441b-9a6a-1a44f29c02ba\" (UID: \"224459f3-ed20-441b-9a6a-1a44f29c02ba\") " Mar 10 16:40:05 crc kubenswrapper[4749]: I0310 16:40:05.021481 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/224459f3-ed20-441b-9a6a-1a44f29c02ba-kube-api-access-w4wzz" (OuterVolumeSpecName: "kube-api-access-w4wzz") pod "224459f3-ed20-441b-9a6a-1a44f29c02ba" (UID: "224459f3-ed20-441b-9a6a-1a44f29c02ba"). InnerVolumeSpecName "kube-api-access-w4wzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:40:05 crc kubenswrapper[4749]: I0310 16:40:05.117799 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4wzz\" (UniqueName: \"kubernetes.io/projected/224459f3-ed20-441b-9a6a-1a44f29c02ba-kube-api-access-w4wzz\") on node \"crc\" DevicePath \"\"" Mar 10 16:40:05 crc kubenswrapper[4749]: I0310 16:40:05.630782 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552680-krnvg" event={"ID":"224459f3-ed20-441b-9a6a-1a44f29c02ba","Type":"ContainerDied","Data":"e7e97414f97a4761a8305c1c7087398d500dd0734656b64bff97cab5e16918ef"} Mar 10 16:40:05 crc kubenswrapper[4749]: I0310 16:40:05.631065 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7e97414f97a4761a8305c1c7087398d500dd0734656b64bff97cab5e16918ef" Mar 10 16:40:05 crc kubenswrapper[4749]: I0310 16:40:05.630828 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552680-krnvg" Mar 10 16:40:05 crc kubenswrapper[4749]: I0310 16:40:05.996101 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552674-gfkpt"] Mar 10 16:40:06 crc kubenswrapper[4749]: I0310 16:40:06.008479 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552674-gfkpt"] Mar 10 16:40:07 crc kubenswrapper[4749]: I0310 16:40:07.620633 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df4925b1-7b8c-470b-96b5-c88613db9212" path="/var/lib/kubelet/pods/df4925b1-7b8c-470b-96b5-c88613db9212/volumes" Mar 10 16:40:15 crc kubenswrapper[4749]: I0310 16:40:15.825940 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lnjbh"] Mar 10 16:40:15 crc kubenswrapper[4749]: E0310 16:40:15.827355 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="224459f3-ed20-441b-9a6a-1a44f29c02ba" containerName="oc" Mar 10 16:40:15 crc kubenswrapper[4749]: I0310 16:40:15.827399 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="224459f3-ed20-441b-9a6a-1a44f29c02ba" containerName="oc" Mar 10 16:40:15 crc kubenswrapper[4749]: I0310 16:40:15.827795 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="224459f3-ed20-441b-9a6a-1a44f29c02ba" containerName="oc" Mar 10 16:40:15 crc kubenswrapper[4749]: I0310 16:40:15.863962 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnjbh"] Mar 10 16:40:15 crc kubenswrapper[4749]: I0310 16:40:15.864227 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:15 crc kubenswrapper[4749]: I0310 16:40:15.890900 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn2bq\" (UniqueName: \"kubernetes.io/projected/e821d321-2c59-4b5b-9f89-e8a6d1371b69-kube-api-access-kn2bq\") pod \"community-operators-lnjbh\" (UID: \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\") " pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:15 crc kubenswrapper[4749]: I0310 16:40:15.891079 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e821d321-2c59-4b5b-9f89-e8a6d1371b69-catalog-content\") pod \"community-operators-lnjbh\" (UID: \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\") " pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:15 crc kubenswrapper[4749]: I0310 16:40:15.891132 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e821d321-2c59-4b5b-9f89-e8a6d1371b69-utilities\") pod \"community-operators-lnjbh\" (UID: \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\") " pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:15 crc kubenswrapper[4749]: I0310 16:40:15.993076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e821d321-2c59-4b5b-9f89-e8a6d1371b69-utilities\") pod \"community-operators-lnjbh\" (UID: \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\") " pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:15 crc kubenswrapper[4749]: I0310 16:40:15.993193 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn2bq\" (UniqueName: \"kubernetes.io/projected/e821d321-2c59-4b5b-9f89-e8a6d1371b69-kube-api-access-kn2bq\") pod 
\"community-operators-lnjbh\" (UID: \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\") " pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:15 crc kubenswrapper[4749]: I0310 16:40:15.993322 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e821d321-2c59-4b5b-9f89-e8a6d1371b69-catalog-content\") pod \"community-operators-lnjbh\" (UID: \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\") " pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:15 crc kubenswrapper[4749]: I0310 16:40:15.993816 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e821d321-2c59-4b5b-9f89-e8a6d1371b69-utilities\") pod \"community-operators-lnjbh\" (UID: \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\") " pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:15 crc kubenswrapper[4749]: I0310 16:40:15.993920 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e821d321-2c59-4b5b-9f89-e8a6d1371b69-catalog-content\") pod \"community-operators-lnjbh\" (UID: \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\") " pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:16 crc kubenswrapper[4749]: I0310 16:40:16.015061 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn2bq\" (UniqueName: \"kubernetes.io/projected/e821d321-2c59-4b5b-9f89-e8a6d1371b69-kube-api-access-kn2bq\") pod \"community-operators-lnjbh\" (UID: \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\") " pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:16 crc kubenswrapper[4749]: I0310 16:40:16.194201 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:16 crc kubenswrapper[4749]: I0310 16:40:16.630446 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnjbh"] Mar 10 16:40:16 crc kubenswrapper[4749]: I0310 16:40:16.705771 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnjbh" event={"ID":"e821d321-2c59-4b5b-9f89-e8a6d1371b69","Type":"ContainerStarted","Data":"0a5bc50a5de6a513af627b7f9d165df86e45545a49fd12245852929509fdc15f"} Mar 10 16:40:17 crc kubenswrapper[4749]: I0310 16:40:17.715720 4749 generic.go:334] "Generic (PLEG): container finished" podID="e821d321-2c59-4b5b-9f89-e8a6d1371b69" containerID="a8e23ed309796d83ed42d69a6037f141e2a22b0255810f339b074e7de145e8f3" exitCode=0 Mar 10 16:40:17 crc kubenswrapper[4749]: I0310 16:40:17.715777 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnjbh" event={"ID":"e821d321-2c59-4b5b-9f89-e8a6d1371b69","Type":"ContainerDied","Data":"a8e23ed309796d83ed42d69a6037f141e2a22b0255810f339b074e7de145e8f3"} Mar 10 16:40:19 crc kubenswrapper[4749]: I0310 16:40:19.736128 4749 generic.go:334] "Generic (PLEG): container finished" podID="e821d321-2c59-4b5b-9f89-e8a6d1371b69" containerID="a7042c868569e26c5864072da0011c081ad1e56467f736d0f1b1d513e82e5202" exitCode=0 Mar 10 16:40:19 crc kubenswrapper[4749]: I0310 16:40:19.736208 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnjbh" event={"ID":"e821d321-2c59-4b5b-9f89-e8a6d1371b69","Type":"ContainerDied","Data":"a7042c868569e26c5864072da0011c081ad1e56467f736d0f1b1d513e82e5202"} Mar 10 16:40:20 crc kubenswrapper[4749]: I0310 16:40:20.747311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnjbh" 
event={"ID":"e821d321-2c59-4b5b-9f89-e8a6d1371b69","Type":"ContainerStarted","Data":"550e60e0d97a833a186d728ef0efec79461ff5244e3fad730ec3783c325377c2"} Mar 10 16:40:20 crc kubenswrapper[4749]: I0310 16:40:20.772075 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lnjbh" podStartSLOduration=3.081522147 podStartE2EDuration="5.772055902s" podCreationTimestamp="2026-03-10 16:40:15 +0000 UTC" firstStartedPulling="2026-03-10 16:40:17.718167971 +0000 UTC m=+3114.840033688" lastFinishedPulling="2026-03-10 16:40:20.408701716 +0000 UTC m=+3117.530567443" observedRunningTime="2026-03-10 16:40:20.769245595 +0000 UTC m=+3117.891111292" watchObservedRunningTime="2026-03-10 16:40:20.772055902 +0000 UTC m=+3117.893921599" Mar 10 16:40:25 crc kubenswrapper[4749]: I0310 16:40:25.091385 4749 scope.go:117] "RemoveContainer" containerID="fad3ba586d6293eb06ff5de002ea67771ca69436c648f0ca2fc617c4c71fe652" Mar 10 16:40:26 crc kubenswrapper[4749]: I0310 16:40:26.195457 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:26 crc kubenswrapper[4749]: I0310 16:40:26.195646 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:26 crc kubenswrapper[4749]: I0310 16:40:26.278609 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:26 crc kubenswrapper[4749]: I0310 16:40:26.870784 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:26 crc kubenswrapper[4749]: I0310 16:40:26.930648 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnjbh"] Mar 10 16:40:28 crc kubenswrapper[4749]: I0310 16:40:28.808974 4749 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/community-operators-lnjbh" podUID="e821d321-2c59-4b5b-9f89-e8a6d1371b69" containerName="registry-server" containerID="cri-o://550e60e0d97a833a186d728ef0efec79461ff5244e3fad730ec3783c325377c2" gracePeriod=2 Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.209973 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.386002 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn2bq\" (UniqueName: \"kubernetes.io/projected/e821d321-2c59-4b5b-9f89-e8a6d1371b69-kube-api-access-kn2bq\") pod \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\" (UID: \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\") " Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.386287 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e821d321-2c59-4b5b-9f89-e8a6d1371b69-catalog-content\") pod \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\" (UID: \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\") " Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.386335 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e821d321-2c59-4b5b-9f89-e8a6d1371b69-utilities\") pod \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\" (UID: \"e821d321-2c59-4b5b-9f89-e8a6d1371b69\") " Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.388130 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e821d321-2c59-4b5b-9f89-e8a6d1371b69-utilities" (OuterVolumeSpecName: "utilities") pod "e821d321-2c59-4b5b-9f89-e8a6d1371b69" (UID: "e821d321-2c59-4b5b-9f89-e8a6d1371b69"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.392613 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e821d321-2c59-4b5b-9f89-e8a6d1371b69-kube-api-access-kn2bq" (OuterVolumeSpecName: "kube-api-access-kn2bq") pod "e821d321-2c59-4b5b-9f89-e8a6d1371b69" (UID: "e821d321-2c59-4b5b-9f89-e8a6d1371b69"). InnerVolumeSpecName "kube-api-access-kn2bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.443779 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e821d321-2c59-4b5b-9f89-e8a6d1371b69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e821d321-2c59-4b5b-9f89-e8a6d1371b69" (UID: "e821d321-2c59-4b5b-9f89-e8a6d1371b69"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.488290 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e821d321-2c59-4b5b-9f89-e8a6d1371b69-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.488344 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e821d321-2c59-4b5b-9f89-e8a6d1371b69-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.488358 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn2bq\" (UniqueName: \"kubernetes.io/projected/e821d321-2c59-4b5b-9f89-e8a6d1371b69-kube-api-access-kn2bq\") on node \"crc\" DevicePath \"\"" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.821898 4749 generic.go:334] "Generic (PLEG): container finished" podID="e821d321-2c59-4b5b-9f89-e8a6d1371b69" 
containerID="550e60e0d97a833a186d728ef0efec79461ff5244e3fad730ec3783c325377c2" exitCode=0 Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.821956 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnjbh" event={"ID":"e821d321-2c59-4b5b-9f89-e8a6d1371b69","Type":"ContainerDied","Data":"550e60e0d97a833a186d728ef0efec79461ff5244e3fad730ec3783c325377c2"} Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.822504 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnjbh" event={"ID":"e821d321-2c59-4b5b-9f89-e8a6d1371b69","Type":"ContainerDied","Data":"0a5bc50a5de6a513af627b7f9d165df86e45545a49fd12245852929509fdc15f"} Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.822539 4749 scope.go:117] "RemoveContainer" containerID="550e60e0d97a833a186d728ef0efec79461ff5244e3fad730ec3783c325377c2" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.822196 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnjbh" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.852123 4749 scope.go:117] "RemoveContainer" containerID="a7042c868569e26c5864072da0011c081ad1e56467f736d0f1b1d513e82e5202" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.852287 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnjbh"] Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.856267 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lnjbh"] Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.873537 4749 scope.go:117] "RemoveContainer" containerID="a8e23ed309796d83ed42d69a6037f141e2a22b0255810f339b074e7de145e8f3" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.889671 4749 scope.go:117] "RemoveContainer" containerID="550e60e0d97a833a186d728ef0efec79461ff5244e3fad730ec3783c325377c2" Mar 10 16:40:29 crc kubenswrapper[4749]: E0310 16:40:29.890250 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"550e60e0d97a833a186d728ef0efec79461ff5244e3fad730ec3783c325377c2\": container with ID starting with 550e60e0d97a833a186d728ef0efec79461ff5244e3fad730ec3783c325377c2 not found: ID does not exist" containerID="550e60e0d97a833a186d728ef0efec79461ff5244e3fad730ec3783c325377c2" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.890296 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550e60e0d97a833a186d728ef0efec79461ff5244e3fad730ec3783c325377c2"} err="failed to get container status \"550e60e0d97a833a186d728ef0efec79461ff5244e3fad730ec3783c325377c2\": rpc error: code = NotFound desc = could not find container \"550e60e0d97a833a186d728ef0efec79461ff5244e3fad730ec3783c325377c2\": container with ID starting with 550e60e0d97a833a186d728ef0efec79461ff5244e3fad730ec3783c325377c2 not 
found: ID does not exist" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.890331 4749 scope.go:117] "RemoveContainer" containerID="a7042c868569e26c5864072da0011c081ad1e56467f736d0f1b1d513e82e5202" Mar 10 16:40:29 crc kubenswrapper[4749]: E0310 16:40:29.890868 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7042c868569e26c5864072da0011c081ad1e56467f736d0f1b1d513e82e5202\": container with ID starting with a7042c868569e26c5864072da0011c081ad1e56467f736d0f1b1d513e82e5202 not found: ID does not exist" containerID="a7042c868569e26c5864072da0011c081ad1e56467f736d0f1b1d513e82e5202" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.890898 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7042c868569e26c5864072da0011c081ad1e56467f736d0f1b1d513e82e5202"} err="failed to get container status \"a7042c868569e26c5864072da0011c081ad1e56467f736d0f1b1d513e82e5202\": rpc error: code = NotFound desc = could not find container \"a7042c868569e26c5864072da0011c081ad1e56467f736d0f1b1d513e82e5202\": container with ID starting with a7042c868569e26c5864072da0011c081ad1e56467f736d0f1b1d513e82e5202 not found: ID does not exist" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.890917 4749 scope.go:117] "RemoveContainer" containerID="a8e23ed309796d83ed42d69a6037f141e2a22b0255810f339b074e7de145e8f3" Mar 10 16:40:29 crc kubenswrapper[4749]: E0310 16:40:29.891148 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e23ed309796d83ed42d69a6037f141e2a22b0255810f339b074e7de145e8f3\": container with ID starting with a8e23ed309796d83ed42d69a6037f141e2a22b0255810f339b074e7de145e8f3 not found: ID does not exist" containerID="a8e23ed309796d83ed42d69a6037f141e2a22b0255810f339b074e7de145e8f3" Mar 10 16:40:29 crc kubenswrapper[4749]: I0310 16:40:29.891177 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e23ed309796d83ed42d69a6037f141e2a22b0255810f339b074e7de145e8f3"} err="failed to get container status \"a8e23ed309796d83ed42d69a6037f141e2a22b0255810f339b074e7de145e8f3\": rpc error: code = NotFound desc = could not find container \"a8e23ed309796d83ed42d69a6037f141e2a22b0255810f339b074e7de145e8f3\": container with ID starting with a8e23ed309796d83ed42d69a6037f141e2a22b0255810f339b074e7de145e8f3 not found: ID does not exist" Mar 10 16:40:31 crc kubenswrapper[4749]: I0310 16:40:31.617406 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e821d321-2c59-4b5b-9f89-e8a6d1371b69" path="/var/lib/kubelet/pods/e821d321-2c59-4b5b-9f89-e8a6d1371b69/volumes" Mar 10 16:40:50 crc kubenswrapper[4749]: I0310 16:40:50.981078 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:40:50 crc kubenswrapper[4749]: I0310 16:40:50.981757 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:41:20 crc kubenswrapper[4749]: I0310 16:41:20.981258 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:41:20 crc kubenswrapper[4749]: I0310 16:41:20.982005 4749 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:41:50 crc kubenswrapper[4749]: I0310 16:41:50.980023 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:41:50 crc kubenswrapper[4749]: I0310 16:41:50.980645 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:41:50 crc kubenswrapper[4749]: I0310 16:41:50.980704 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 16:41:50 crc kubenswrapper[4749]: I0310 16:41:50.981426 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:41:50 crc kubenswrapper[4749]: I0310 16:41:50.981491 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" 
containerID="cri-o://6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" gracePeriod=600 Mar 10 16:41:51 crc kubenswrapper[4749]: E0310 16:41:51.104168 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:41:51 crc kubenswrapper[4749]: I0310 16:41:51.476273 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" exitCode=0 Mar 10 16:41:51 crc kubenswrapper[4749]: I0310 16:41:51.476324 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373"} Mar 10 16:41:51 crc kubenswrapper[4749]: I0310 16:41:51.476360 4749 scope.go:117] "RemoveContainer" containerID="947b4a95f3f3dad04ce41601e5ebbfe62834f04cf5e2c3a475ba5da266ed6956" Mar 10 16:41:51 crc kubenswrapper[4749]: I0310 16:41:51.477082 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:41:51 crc kubenswrapper[4749]: E0310 16:41:51.477597 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.143727 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552682-vwjmc"] Mar 10 16:42:00 crc kubenswrapper[4749]: E0310 16:42:00.144623 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e821d321-2c59-4b5b-9f89-e8a6d1371b69" containerName="extract-utilities" Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.144638 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e821d321-2c59-4b5b-9f89-e8a6d1371b69" containerName="extract-utilities" Mar 10 16:42:00 crc kubenswrapper[4749]: E0310 16:42:00.144649 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e821d321-2c59-4b5b-9f89-e8a6d1371b69" containerName="registry-server" Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.144656 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e821d321-2c59-4b5b-9f89-e8a6d1371b69" containerName="registry-server" Mar 10 16:42:00 crc kubenswrapper[4749]: E0310 16:42:00.144682 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e821d321-2c59-4b5b-9f89-e8a6d1371b69" containerName="extract-content" Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.144691 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e821d321-2c59-4b5b-9f89-e8a6d1371b69" containerName="extract-content" Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.144845 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e821d321-2c59-4b5b-9f89-e8a6d1371b69" containerName="registry-server" Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.145645 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552682-vwjmc" Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.148649 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.149274 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.150737 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.153937 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552682-vwjmc"] Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.235245 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xdvv\" (UniqueName: \"kubernetes.io/projected/c6be2358-2dde-4390-aea2-bae8b848fa78-kube-api-access-5xdvv\") pod \"auto-csr-approver-29552682-vwjmc\" (UID: \"c6be2358-2dde-4390-aea2-bae8b848fa78\") " pod="openshift-infra/auto-csr-approver-29552682-vwjmc" Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.336523 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xdvv\" (UniqueName: \"kubernetes.io/projected/c6be2358-2dde-4390-aea2-bae8b848fa78-kube-api-access-5xdvv\") pod \"auto-csr-approver-29552682-vwjmc\" (UID: \"c6be2358-2dde-4390-aea2-bae8b848fa78\") " pod="openshift-infra/auto-csr-approver-29552682-vwjmc" Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.353903 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xdvv\" (UniqueName: \"kubernetes.io/projected/c6be2358-2dde-4390-aea2-bae8b848fa78-kube-api-access-5xdvv\") pod \"auto-csr-approver-29552682-vwjmc\" (UID: \"c6be2358-2dde-4390-aea2-bae8b848fa78\") " 
pod="openshift-infra/auto-csr-approver-29552682-vwjmc" Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.466747 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552682-vwjmc" Mar 10 16:42:00 crc kubenswrapper[4749]: I0310 16:42:00.867876 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552682-vwjmc"] Mar 10 16:42:01 crc kubenswrapper[4749]: I0310 16:42:01.557624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552682-vwjmc" event={"ID":"c6be2358-2dde-4390-aea2-bae8b848fa78","Type":"ContainerStarted","Data":"39aef561c15a5c1d5e0918c042e5bf29dd085059d8f2b6277e403909cdc7cbf0"} Mar 10 16:42:03 crc kubenswrapper[4749]: I0310 16:42:03.580993 4749 generic.go:334] "Generic (PLEG): container finished" podID="c6be2358-2dde-4390-aea2-bae8b848fa78" containerID="cda9e59371b7d15e8dca49490434a191fc9f7846103e774702b1068a768d59c4" exitCode=0 Mar 10 16:42:03 crc kubenswrapper[4749]: I0310 16:42:03.581067 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552682-vwjmc" event={"ID":"c6be2358-2dde-4390-aea2-bae8b848fa78","Type":"ContainerDied","Data":"cda9e59371b7d15e8dca49490434a191fc9f7846103e774702b1068a768d59c4"} Mar 10 16:42:04 crc kubenswrapper[4749]: I0310 16:42:04.606472 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:42:04 crc kubenswrapper[4749]: E0310 16:42:04.607479 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" 
Mar 10 16:42:04 crc kubenswrapper[4749]: I0310 16:42:04.834279 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552682-vwjmc" Mar 10 16:42:05 crc kubenswrapper[4749]: I0310 16:42:05.004364 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xdvv\" (UniqueName: \"kubernetes.io/projected/c6be2358-2dde-4390-aea2-bae8b848fa78-kube-api-access-5xdvv\") pod \"c6be2358-2dde-4390-aea2-bae8b848fa78\" (UID: \"c6be2358-2dde-4390-aea2-bae8b848fa78\") " Mar 10 16:42:05 crc kubenswrapper[4749]: I0310 16:42:05.013671 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6be2358-2dde-4390-aea2-bae8b848fa78-kube-api-access-5xdvv" (OuterVolumeSpecName: "kube-api-access-5xdvv") pod "c6be2358-2dde-4390-aea2-bae8b848fa78" (UID: "c6be2358-2dde-4390-aea2-bae8b848fa78"). InnerVolumeSpecName "kube-api-access-5xdvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:42:05 crc kubenswrapper[4749]: I0310 16:42:05.105393 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xdvv\" (UniqueName: \"kubernetes.io/projected/c6be2358-2dde-4390-aea2-bae8b848fa78-kube-api-access-5xdvv\") on node \"crc\" DevicePath \"\"" Mar 10 16:42:05 crc kubenswrapper[4749]: I0310 16:42:05.594018 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552682-vwjmc" event={"ID":"c6be2358-2dde-4390-aea2-bae8b848fa78","Type":"ContainerDied","Data":"39aef561c15a5c1d5e0918c042e5bf29dd085059d8f2b6277e403909cdc7cbf0"} Mar 10 16:42:05 crc kubenswrapper[4749]: I0310 16:42:05.594356 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39aef561c15a5c1d5e0918c042e5bf29dd085059d8f2b6277e403909cdc7cbf0" Mar 10 16:42:05 crc kubenswrapper[4749]: I0310 16:42:05.594056 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552682-vwjmc" Mar 10 16:42:05 crc kubenswrapper[4749]: I0310 16:42:05.901346 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552676-9gtl7"] Mar 10 16:42:05 crc kubenswrapper[4749]: I0310 16:42:05.907991 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552676-9gtl7"] Mar 10 16:42:07 crc kubenswrapper[4749]: I0310 16:42:07.618459 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3c3915-48bd-43ae-a1da-e74e53b1ec0e" path="/var/lib/kubelet/pods/2c3c3915-48bd-43ae-a1da-e74e53b1ec0e/volumes" Mar 10 16:42:18 crc kubenswrapper[4749]: I0310 16:42:18.607092 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:42:18 crc kubenswrapper[4749]: E0310 16:42:18.607881 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:42:25 crc kubenswrapper[4749]: I0310 16:42:25.219573 4749 scope.go:117] "RemoveContainer" containerID="daad1e35937d991d7845f0e8f65eaa4c64be90b5526f5fe3b9c916ba9325da12" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.326741 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wbjc9"] Mar 10 16:42:29 crc kubenswrapper[4749]: E0310 16:42:29.327691 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6be2358-2dde-4390-aea2-bae8b848fa78" containerName="oc" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.327709 4749 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c6be2358-2dde-4390-aea2-bae8b848fa78" containerName="oc" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.327884 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6be2358-2dde-4390-aea2-bae8b848fa78" containerName="oc" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.329068 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.342234 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbjc9"] Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.474259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b4520e-9700-46fd-92fe-a7d163a5fd76-catalog-content\") pod \"certified-operators-wbjc9\" (UID: \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\") " pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.474330 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b4520e-9700-46fd-92fe-a7d163a5fd76-utilities\") pod \"certified-operators-wbjc9\" (UID: \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\") " pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.474360 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmrxl\" (UniqueName: \"kubernetes.io/projected/e3b4520e-9700-46fd-92fe-a7d163a5fd76-kube-api-access-lmrxl\") pod \"certified-operators-wbjc9\" (UID: \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\") " pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.576315 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b4520e-9700-46fd-92fe-a7d163a5fd76-catalog-content\") pod \"certified-operators-wbjc9\" (UID: \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\") " pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.576407 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b4520e-9700-46fd-92fe-a7d163a5fd76-utilities\") pod \"certified-operators-wbjc9\" (UID: \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\") " pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.576442 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmrxl\" (UniqueName: \"kubernetes.io/projected/e3b4520e-9700-46fd-92fe-a7d163a5fd76-kube-api-access-lmrxl\") pod \"certified-operators-wbjc9\" (UID: \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\") " pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.577268 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b4520e-9700-46fd-92fe-a7d163a5fd76-catalog-content\") pod \"certified-operators-wbjc9\" (UID: \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\") " pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.577575 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b4520e-9700-46fd-92fe-a7d163a5fd76-utilities\") pod \"certified-operators-wbjc9\" (UID: \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\") " pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.603687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lmrxl\" (UniqueName: \"kubernetes.io/projected/e3b4520e-9700-46fd-92fe-a7d163a5fd76-kube-api-access-lmrxl\") pod \"certified-operators-wbjc9\" (UID: \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\") " pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.609279 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:42:29 crc kubenswrapper[4749]: E0310 16:42:29.609861 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:42:29 crc kubenswrapper[4749]: I0310 16:42:29.648860 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:30 crc kubenswrapper[4749]: I0310 16:42:30.137389 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbjc9"] Mar 10 16:42:30 crc kubenswrapper[4749]: I0310 16:42:30.795261 4749 generic.go:334] "Generic (PLEG): container finished" podID="e3b4520e-9700-46fd-92fe-a7d163a5fd76" containerID="db8719f023177541c7bd194f5a0ccedbd39b4cefc0f0bbdc958e0a871a9859df" exitCode=0 Mar 10 16:42:30 crc kubenswrapper[4749]: I0310 16:42:30.795353 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbjc9" event={"ID":"e3b4520e-9700-46fd-92fe-a7d163a5fd76","Type":"ContainerDied","Data":"db8719f023177541c7bd194f5a0ccedbd39b4cefc0f0bbdc958e0a871a9859df"} Mar 10 16:42:30 crc kubenswrapper[4749]: I0310 16:42:30.795667 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbjc9" event={"ID":"e3b4520e-9700-46fd-92fe-a7d163a5fd76","Type":"ContainerStarted","Data":"df7fc7adadc05a164394f25461f6333f32c5db17856f25fddc742e1f3e9a700a"} Mar 10 16:42:30 crc kubenswrapper[4749]: I0310 16:42:30.797323 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:42:31 crc kubenswrapper[4749]: I0310 16:42:31.802842 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbjc9" event={"ID":"e3b4520e-9700-46fd-92fe-a7d163a5fd76","Type":"ContainerStarted","Data":"85ec5128298a99823bc25403bef5d88da5a76f1273d8d81830c3e0fae56c284c"} Mar 10 16:42:32 crc kubenswrapper[4749]: I0310 16:42:32.815430 4749 generic.go:334] "Generic (PLEG): container finished" podID="e3b4520e-9700-46fd-92fe-a7d163a5fd76" containerID="85ec5128298a99823bc25403bef5d88da5a76f1273d8d81830c3e0fae56c284c" exitCode=0 Mar 10 16:42:32 crc kubenswrapper[4749]: I0310 16:42:32.815489 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-wbjc9" event={"ID":"e3b4520e-9700-46fd-92fe-a7d163a5fd76","Type":"ContainerDied","Data":"85ec5128298a99823bc25403bef5d88da5a76f1273d8d81830c3e0fae56c284c"} Mar 10 16:42:33 crc kubenswrapper[4749]: I0310 16:42:33.825911 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbjc9" event={"ID":"e3b4520e-9700-46fd-92fe-a7d163a5fd76","Type":"ContainerStarted","Data":"79088c648caf157b4418b12c6109f730d31e1488b69f266fa5b90e1de170ca35"} Mar 10 16:42:33 crc kubenswrapper[4749]: I0310 16:42:33.842650 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wbjc9" podStartSLOduration=2.31350208 podStartE2EDuration="4.842631102s" podCreationTimestamp="2026-03-10 16:42:29 +0000 UTC" firstStartedPulling="2026-03-10 16:42:30.796953926 +0000 UTC m=+3247.918819613" lastFinishedPulling="2026-03-10 16:42:33.326082948 +0000 UTC m=+3250.447948635" observedRunningTime="2026-03-10 16:42:33.840652308 +0000 UTC m=+3250.962518015" watchObservedRunningTime="2026-03-10 16:42:33.842631102 +0000 UTC m=+3250.964496789" Mar 10 16:42:39 crc kubenswrapper[4749]: I0310 16:42:39.649188 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:39 crc kubenswrapper[4749]: I0310 16:42:39.650167 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:39 crc kubenswrapper[4749]: I0310 16:42:39.706066 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:39 crc kubenswrapper[4749]: I0310 16:42:39.904189 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:39 crc kubenswrapper[4749]: I0310 16:42:39.960275 
4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wbjc9"] Mar 10 16:42:41 crc kubenswrapper[4749]: I0310 16:42:41.606492 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:42:41 crc kubenswrapper[4749]: E0310 16:42:41.606776 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:42:41 crc kubenswrapper[4749]: I0310 16:42:41.878966 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wbjc9" podUID="e3b4520e-9700-46fd-92fe-a7d163a5fd76" containerName="registry-server" containerID="cri-o://79088c648caf157b4418b12c6109f730d31e1488b69f266fa5b90e1de170ca35" gracePeriod=2 Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.837696 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.886949 4749 generic.go:334] "Generic (PLEG): container finished" podID="e3b4520e-9700-46fd-92fe-a7d163a5fd76" containerID="79088c648caf157b4418b12c6109f730d31e1488b69f266fa5b90e1de170ca35" exitCode=0 Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.887003 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbjc9" event={"ID":"e3b4520e-9700-46fd-92fe-a7d163a5fd76","Type":"ContainerDied","Data":"79088c648caf157b4418b12c6109f730d31e1488b69f266fa5b90e1de170ca35"} Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.887034 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbjc9" event={"ID":"e3b4520e-9700-46fd-92fe-a7d163a5fd76","Type":"ContainerDied","Data":"df7fc7adadc05a164394f25461f6333f32c5db17856f25fddc742e1f3e9a700a"} Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.887056 4749 scope.go:117] "RemoveContainer" containerID="79088c648caf157b4418b12c6109f730d31e1488b69f266fa5b90e1de170ca35" Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.887208 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbjc9" Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.903096 4749 scope.go:117] "RemoveContainer" containerID="85ec5128298a99823bc25403bef5d88da5a76f1273d8d81830c3e0fae56c284c" Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.918329 4749 scope.go:117] "RemoveContainer" containerID="db8719f023177541c7bd194f5a0ccedbd39b4cefc0f0bbdc958e0a871a9859df" Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.949008 4749 scope.go:117] "RemoveContainer" containerID="79088c648caf157b4418b12c6109f730d31e1488b69f266fa5b90e1de170ca35" Mar 10 16:42:42 crc kubenswrapper[4749]: E0310 16:42:42.949670 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79088c648caf157b4418b12c6109f730d31e1488b69f266fa5b90e1de170ca35\": container with ID starting with 79088c648caf157b4418b12c6109f730d31e1488b69f266fa5b90e1de170ca35 not found: ID does not exist" containerID="79088c648caf157b4418b12c6109f730d31e1488b69f266fa5b90e1de170ca35" Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.949701 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79088c648caf157b4418b12c6109f730d31e1488b69f266fa5b90e1de170ca35"} err="failed to get container status \"79088c648caf157b4418b12c6109f730d31e1488b69f266fa5b90e1de170ca35\": rpc error: code = NotFound desc = could not find container \"79088c648caf157b4418b12c6109f730d31e1488b69f266fa5b90e1de170ca35\": container with ID starting with 79088c648caf157b4418b12c6109f730d31e1488b69f266fa5b90e1de170ca35 not found: ID does not exist" Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.949731 4749 scope.go:117] "RemoveContainer" containerID="85ec5128298a99823bc25403bef5d88da5a76f1273d8d81830c3e0fae56c284c" Mar 10 16:42:42 crc kubenswrapper[4749]: E0310 16:42:42.950159 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"85ec5128298a99823bc25403bef5d88da5a76f1273d8d81830c3e0fae56c284c\": container with ID starting with 85ec5128298a99823bc25403bef5d88da5a76f1273d8d81830c3e0fae56c284c not found: ID does not exist" containerID="85ec5128298a99823bc25403bef5d88da5a76f1273d8d81830c3e0fae56c284c" Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.950183 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ec5128298a99823bc25403bef5d88da5a76f1273d8d81830c3e0fae56c284c"} err="failed to get container status \"85ec5128298a99823bc25403bef5d88da5a76f1273d8d81830c3e0fae56c284c\": rpc error: code = NotFound desc = could not find container \"85ec5128298a99823bc25403bef5d88da5a76f1273d8d81830c3e0fae56c284c\": container with ID starting with 85ec5128298a99823bc25403bef5d88da5a76f1273d8d81830c3e0fae56c284c not found: ID does not exist" Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.950201 4749 scope.go:117] "RemoveContainer" containerID="db8719f023177541c7bd194f5a0ccedbd39b4cefc0f0bbdc958e0a871a9859df" Mar 10 16:42:42 crc kubenswrapper[4749]: E0310 16:42:42.950611 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db8719f023177541c7bd194f5a0ccedbd39b4cefc0f0bbdc958e0a871a9859df\": container with ID starting with db8719f023177541c7bd194f5a0ccedbd39b4cefc0f0bbdc958e0a871a9859df not found: ID does not exist" containerID="db8719f023177541c7bd194f5a0ccedbd39b4cefc0f0bbdc958e0a871a9859df" Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.950669 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8719f023177541c7bd194f5a0ccedbd39b4cefc0f0bbdc958e0a871a9859df"} err="failed to get container status \"db8719f023177541c7bd194f5a0ccedbd39b4cefc0f0bbdc958e0a871a9859df\": rpc error: code = NotFound desc = could not find container 
\"db8719f023177541c7bd194f5a0ccedbd39b4cefc0f0bbdc958e0a871a9859df\": container with ID starting with db8719f023177541c7bd194f5a0ccedbd39b4cefc0f0bbdc958e0a871a9859df not found: ID does not exist" Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.989647 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b4520e-9700-46fd-92fe-a7d163a5fd76-catalog-content\") pod \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\" (UID: \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\") " Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.989787 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmrxl\" (UniqueName: \"kubernetes.io/projected/e3b4520e-9700-46fd-92fe-a7d163a5fd76-kube-api-access-lmrxl\") pod \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\" (UID: \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\") " Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.989872 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b4520e-9700-46fd-92fe-a7d163a5fd76-utilities\") pod \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\" (UID: \"e3b4520e-9700-46fd-92fe-a7d163a5fd76\") " Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.990872 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b4520e-9700-46fd-92fe-a7d163a5fd76-utilities" (OuterVolumeSpecName: "utilities") pod "e3b4520e-9700-46fd-92fe-a7d163a5fd76" (UID: "e3b4520e-9700-46fd-92fe-a7d163a5fd76"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:42:42 crc kubenswrapper[4749]: I0310 16:42:42.995589 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b4520e-9700-46fd-92fe-a7d163a5fd76-kube-api-access-lmrxl" (OuterVolumeSpecName: "kube-api-access-lmrxl") pod "e3b4520e-9700-46fd-92fe-a7d163a5fd76" (UID: "e3b4520e-9700-46fd-92fe-a7d163a5fd76"). InnerVolumeSpecName "kube-api-access-lmrxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:42:43 crc kubenswrapper[4749]: I0310 16:42:43.046259 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b4520e-9700-46fd-92fe-a7d163a5fd76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3b4520e-9700-46fd-92fe-a7d163a5fd76" (UID: "e3b4520e-9700-46fd-92fe-a7d163a5fd76"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:42:43 crc kubenswrapper[4749]: I0310 16:42:43.091398 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b4520e-9700-46fd-92fe-a7d163a5fd76-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:42:43 crc kubenswrapper[4749]: I0310 16:42:43.091429 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmrxl\" (UniqueName: \"kubernetes.io/projected/e3b4520e-9700-46fd-92fe-a7d163a5fd76-kube-api-access-lmrxl\") on node \"crc\" DevicePath \"\"" Mar 10 16:42:43 crc kubenswrapper[4749]: I0310 16:42:43.091441 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b4520e-9700-46fd-92fe-a7d163a5fd76-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:42:43 crc kubenswrapper[4749]: I0310 16:42:43.223027 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wbjc9"] Mar 10 16:42:43 crc kubenswrapper[4749]: I0310 
16:42:43.234267 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wbjc9"] Mar 10 16:42:43 crc kubenswrapper[4749]: I0310 16:42:43.618238 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b4520e-9700-46fd-92fe-a7d163a5fd76" path="/var/lib/kubelet/pods/e3b4520e-9700-46fd-92fe-a7d163a5fd76/volumes" Mar 10 16:42:53 crc kubenswrapper[4749]: I0310 16:42:53.611829 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:42:53 crc kubenswrapper[4749]: E0310 16:42:53.612648 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:43:07 crc kubenswrapper[4749]: I0310 16:43:07.607247 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:43:07 crc kubenswrapper[4749]: E0310 16:43:07.608065 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:43:22 crc kubenswrapper[4749]: I0310 16:43:22.606640 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:43:22 crc kubenswrapper[4749]: E0310 16:43:22.607830 4749 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:43:36 crc kubenswrapper[4749]: I0310 16:43:36.607005 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:43:36 crc kubenswrapper[4749]: E0310 16:43:36.607712 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:43:50 crc kubenswrapper[4749]: I0310 16:43:50.606851 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:43:50 crc kubenswrapper[4749]: E0310 16:43:50.607737 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.185283 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552684-5pt8r"] Mar 10 16:44:00 crc kubenswrapper[4749]: E0310 16:44:00.186085 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e3b4520e-9700-46fd-92fe-a7d163a5fd76" containerName="registry-server" Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.186098 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b4520e-9700-46fd-92fe-a7d163a5fd76" containerName="registry-server" Mar 10 16:44:00 crc kubenswrapper[4749]: E0310 16:44:00.186110 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b4520e-9700-46fd-92fe-a7d163a5fd76" containerName="extract-content" Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.186116 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b4520e-9700-46fd-92fe-a7d163a5fd76" containerName="extract-content" Mar 10 16:44:00 crc kubenswrapper[4749]: E0310 16:44:00.186127 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b4520e-9700-46fd-92fe-a7d163a5fd76" containerName="extract-utilities" Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.186133 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b4520e-9700-46fd-92fe-a7d163a5fd76" containerName="extract-utilities" Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.186277 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b4520e-9700-46fd-92fe-a7d163a5fd76" containerName="registry-server" Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.186753 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552684-5pt8r" Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.190437 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.190636 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.191737 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.199034 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552684-5pt8r"] Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.301395 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2vs9\" (UniqueName: \"kubernetes.io/projected/b94a1492-616a-49e7-818e-9fc0533f91ce-kube-api-access-z2vs9\") pod \"auto-csr-approver-29552684-5pt8r\" (UID: \"b94a1492-616a-49e7-818e-9fc0533f91ce\") " pod="openshift-infra/auto-csr-approver-29552684-5pt8r" Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.403499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2vs9\" (UniqueName: \"kubernetes.io/projected/b94a1492-616a-49e7-818e-9fc0533f91ce-kube-api-access-z2vs9\") pod \"auto-csr-approver-29552684-5pt8r\" (UID: \"b94a1492-616a-49e7-818e-9fc0533f91ce\") " pod="openshift-infra/auto-csr-approver-29552684-5pt8r" Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.421591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2vs9\" (UniqueName: \"kubernetes.io/projected/b94a1492-616a-49e7-818e-9fc0533f91ce-kube-api-access-z2vs9\") pod \"auto-csr-approver-29552684-5pt8r\" (UID: \"b94a1492-616a-49e7-818e-9fc0533f91ce\") " 
pod="openshift-infra/auto-csr-approver-29552684-5pt8r" Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.512104 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552684-5pt8r" Mar 10 16:44:00 crc kubenswrapper[4749]: I0310 16:44:00.941074 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552684-5pt8r"] Mar 10 16:44:01 crc kubenswrapper[4749]: I0310 16:44:01.448893 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552684-5pt8r" event={"ID":"b94a1492-616a-49e7-818e-9fc0533f91ce","Type":"ContainerStarted","Data":"12d7b9139fb70a6952d34fe34ce95fa68043232cc50578438940254103ed2af4"} Mar 10 16:44:02 crc kubenswrapper[4749]: I0310 16:44:02.459763 4749 generic.go:334] "Generic (PLEG): container finished" podID="b94a1492-616a-49e7-818e-9fc0533f91ce" containerID="a19acd38052fc166a2850d7b7b5969ebd89a681208074c7dc70a0b2ba1337ed3" exitCode=0 Mar 10 16:44:02 crc kubenswrapper[4749]: I0310 16:44:02.459809 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552684-5pt8r" event={"ID":"b94a1492-616a-49e7-818e-9fc0533f91ce","Type":"ContainerDied","Data":"a19acd38052fc166a2850d7b7b5969ebd89a681208074c7dc70a0b2ba1337ed3"} Mar 10 16:44:03 crc kubenswrapper[4749]: I0310 16:44:03.611307 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:44:03 crc kubenswrapper[4749]: E0310 16:44:03.612021 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" 
Mar 10 16:44:03 crc kubenswrapper[4749]: I0310 16:44:03.697706 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552684-5pt8r"
Mar 10 16:44:03 crc kubenswrapper[4749]: I0310 16:44:03.756254 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2vs9\" (UniqueName: \"kubernetes.io/projected/b94a1492-616a-49e7-818e-9fc0533f91ce-kube-api-access-z2vs9\") pod \"b94a1492-616a-49e7-818e-9fc0533f91ce\" (UID: \"b94a1492-616a-49e7-818e-9fc0533f91ce\") "
Mar 10 16:44:03 crc kubenswrapper[4749]: I0310 16:44:03.762083 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b94a1492-616a-49e7-818e-9fc0533f91ce-kube-api-access-z2vs9" (OuterVolumeSpecName: "kube-api-access-z2vs9") pod "b94a1492-616a-49e7-818e-9fc0533f91ce" (UID: "b94a1492-616a-49e7-818e-9fc0533f91ce"). InnerVolumeSpecName "kube-api-access-z2vs9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:44:03 crc kubenswrapper[4749]: I0310 16:44:03.858279 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2vs9\" (UniqueName: \"kubernetes.io/projected/b94a1492-616a-49e7-818e-9fc0533f91ce-kube-api-access-z2vs9\") on node \"crc\" DevicePath \"\""
Mar 10 16:44:04 crc kubenswrapper[4749]: I0310 16:44:04.476641 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552684-5pt8r" event={"ID":"b94a1492-616a-49e7-818e-9fc0533f91ce","Type":"ContainerDied","Data":"12d7b9139fb70a6952d34fe34ce95fa68043232cc50578438940254103ed2af4"}
Mar 10 16:44:04 crc kubenswrapper[4749]: I0310 16:44:04.476983 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12d7b9139fb70a6952d34fe34ce95fa68043232cc50578438940254103ed2af4"
Mar 10 16:44:04 crc kubenswrapper[4749]: I0310 16:44:04.476793 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552684-5pt8r"
Mar 10 16:44:04 crc kubenswrapper[4749]: I0310 16:44:04.768876 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552678-x7mlg"]
Mar 10 16:44:04 crc kubenswrapper[4749]: I0310 16:44:04.791388 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552678-x7mlg"]
Mar 10 16:44:05 crc kubenswrapper[4749]: I0310 16:44:05.616605 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392fc373-288c-4653-9ea8-8bd54d5deac2" path="/var/lib/kubelet/pods/392fc373-288c-4653-9ea8-8bd54d5deac2/volumes"
Mar 10 16:44:18 crc kubenswrapper[4749]: I0310 16:44:18.606537 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373"
Mar 10 16:44:18 crc kubenswrapper[4749]: E0310 16:44:18.607271 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 16:44:25 crc kubenswrapper[4749]: I0310 16:44:25.343314 4749 scope.go:117] "RemoveContainer" containerID="9a9eed78cf502edd34f2d4b1faae36ed06f5d98e6ed03e6e4a641f8cc56af325"
Mar 10 16:44:33 crc kubenswrapper[4749]: I0310 16:44:33.610101 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373"
Mar 10 16:44:33 crc kubenswrapper[4749]: E0310 16:44:33.610900 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 16:44:47 crc kubenswrapper[4749]: I0310 16:44:47.607193 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373"
Mar 10 16:44:47 crc kubenswrapper[4749]: E0310 16:44:47.607864 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.112924 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vztlz"]
Mar 10 16:44:53 crc kubenswrapper[4749]: E0310 16:44:53.113887 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b94a1492-616a-49e7-818e-9fc0533f91ce" containerName="oc"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.113905 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94a1492-616a-49e7-818e-9fc0533f91ce" containerName="oc"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.114082 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b94a1492-616a-49e7-818e-9fc0533f91ce" containerName="oc"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.115307 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.126624 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vztlz"]
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.166712 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ab0f15c-784b-4594-8268-1ec9edd5c06d-catalog-content\") pod \"redhat-operators-vztlz\" (UID: \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\") " pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.166780 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvp6c\" (UniqueName: \"kubernetes.io/projected/7ab0f15c-784b-4594-8268-1ec9edd5c06d-kube-api-access-qvp6c\") pod \"redhat-operators-vztlz\" (UID: \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\") " pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.166880 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ab0f15c-784b-4594-8268-1ec9edd5c06d-utilities\") pod \"redhat-operators-vztlz\" (UID: \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\") " pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.267928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ab0f15c-784b-4594-8268-1ec9edd5c06d-catalog-content\") pod \"redhat-operators-vztlz\" (UID: \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\") " pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.268286 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvp6c\" (UniqueName: \"kubernetes.io/projected/7ab0f15c-784b-4594-8268-1ec9edd5c06d-kube-api-access-qvp6c\") pod \"redhat-operators-vztlz\" (UID: \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\") " pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.268481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ab0f15c-784b-4594-8268-1ec9edd5c06d-utilities\") pod \"redhat-operators-vztlz\" (UID: \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\") " pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.268572 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ab0f15c-784b-4594-8268-1ec9edd5c06d-catalog-content\") pod \"redhat-operators-vztlz\" (UID: \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\") " pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.269000 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ab0f15c-784b-4594-8268-1ec9edd5c06d-utilities\") pod \"redhat-operators-vztlz\" (UID: \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\") " pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.291223 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvp6c\" (UniqueName: \"kubernetes.io/projected/7ab0f15c-784b-4594-8268-1ec9edd5c06d-kube-api-access-qvp6c\") pod \"redhat-operators-vztlz\" (UID: \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\") " pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.450257 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:44:53 crc kubenswrapper[4749]: I0310 16:44:53.870288 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vztlz"]
Mar 10 16:44:54 crc kubenswrapper[4749]: I0310 16:44:54.810526 4749 generic.go:334] "Generic (PLEG): container finished" podID="7ab0f15c-784b-4594-8268-1ec9edd5c06d" containerID="f880e5fb7d6a9906059fa2b39d967ed2f17b355f8a50ae10a99500711fa87743" exitCode=0
Mar 10 16:44:54 crc kubenswrapper[4749]: I0310 16:44:54.810613 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vztlz" event={"ID":"7ab0f15c-784b-4594-8268-1ec9edd5c06d","Type":"ContainerDied","Data":"f880e5fb7d6a9906059fa2b39d967ed2f17b355f8a50ae10a99500711fa87743"}
Mar 10 16:44:54 crc kubenswrapper[4749]: I0310 16:44:54.810840 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vztlz" event={"ID":"7ab0f15c-784b-4594-8268-1ec9edd5c06d","Type":"ContainerStarted","Data":"f649bd83aef297ef556ed15c83f6fff6d3f341ac059ecbbd78c4420cd34af54c"}
Mar 10 16:44:55 crc kubenswrapper[4749]: I0310 16:44:55.818511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vztlz" event={"ID":"7ab0f15c-784b-4594-8268-1ec9edd5c06d","Type":"ContainerStarted","Data":"77a5d7176f8a08c9630d39ca8c4649920c04bfd8b1d7fecf991b1b4a12f5a258"}
Mar 10 16:44:56 crc kubenswrapper[4749]: I0310 16:44:56.827417 4749 generic.go:334] "Generic (PLEG): container finished" podID="7ab0f15c-784b-4594-8268-1ec9edd5c06d" containerID="77a5d7176f8a08c9630d39ca8c4649920c04bfd8b1d7fecf991b1b4a12f5a258" exitCode=0
Mar 10 16:44:56 crc kubenswrapper[4749]: I0310 16:44:56.827470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vztlz" event={"ID":"7ab0f15c-784b-4594-8268-1ec9edd5c06d","Type":"ContainerDied","Data":"77a5d7176f8a08c9630d39ca8c4649920c04bfd8b1d7fecf991b1b4a12f5a258"}
Mar 10 16:44:57 crc kubenswrapper[4749]: I0310 16:44:57.836017 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vztlz" event={"ID":"7ab0f15c-784b-4594-8268-1ec9edd5c06d","Type":"ContainerStarted","Data":"9ed918595bf3b914b702b01045994c4a6a3cca95849725b07f151e559ce794fc"}
Mar 10 16:44:57 crc kubenswrapper[4749]: I0310 16:44:57.856850 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vztlz" podStartSLOduration=2.406763201 podStartE2EDuration="4.856829121s" podCreationTimestamp="2026-03-10 16:44:53 +0000 UTC" firstStartedPulling="2026-03-10 16:44:54.8131852 +0000 UTC m=+3391.935050887" lastFinishedPulling="2026-03-10 16:44:57.26325113 +0000 UTC m=+3394.385116807" observedRunningTime="2026-03-10 16:44:57.853273504 +0000 UTC m=+3394.975139191" watchObservedRunningTime="2026-03-10 16:44:57.856829121 +0000 UTC m=+3394.978694808"
Mar 10 16:44:59 crc kubenswrapper[4749]: I0310 16:44:59.606187 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373"
Mar 10 16:44:59 crc kubenswrapper[4749]: E0310 16:44:59.607279 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.145943 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"]
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.146983 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.150598 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.150719 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.156657 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"]
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.256616 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kfx2\" (UniqueName: \"kubernetes.io/projected/3745e088-a556-46c2-90cf-468698371ccf-kube-api-access-9kfx2\") pod \"collect-profiles-29552685-kgjgx\" (UID: \"3745e088-a556-46c2-90cf-468698371ccf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.256707 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3745e088-a556-46c2-90cf-468698371ccf-config-volume\") pod \"collect-profiles-29552685-kgjgx\" (UID: \"3745e088-a556-46c2-90cf-468698371ccf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.256761 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3745e088-a556-46c2-90cf-468698371ccf-secret-volume\") pod \"collect-profiles-29552685-kgjgx\" (UID: \"3745e088-a556-46c2-90cf-468698371ccf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.359348 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3745e088-a556-46c2-90cf-468698371ccf-config-volume\") pod \"collect-profiles-29552685-kgjgx\" (UID: \"3745e088-a556-46c2-90cf-468698371ccf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.359477 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3745e088-a556-46c2-90cf-468698371ccf-secret-volume\") pod \"collect-profiles-29552685-kgjgx\" (UID: \"3745e088-a556-46c2-90cf-468698371ccf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.359524 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kfx2\" (UniqueName: \"kubernetes.io/projected/3745e088-a556-46c2-90cf-468698371ccf-kube-api-access-9kfx2\") pod \"collect-profiles-29552685-kgjgx\" (UID: \"3745e088-a556-46c2-90cf-468698371ccf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.360584 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3745e088-a556-46c2-90cf-468698371ccf-config-volume\") pod \"collect-profiles-29552685-kgjgx\" (UID: \"3745e088-a556-46c2-90cf-468698371ccf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.367229 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3745e088-a556-46c2-90cf-468698371ccf-secret-volume\") pod \"collect-profiles-29552685-kgjgx\" (UID: \"3745e088-a556-46c2-90cf-468698371ccf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.374976 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kfx2\" (UniqueName: \"kubernetes.io/projected/3745e088-a556-46c2-90cf-468698371ccf-kube-api-access-9kfx2\") pod \"collect-profiles-29552685-kgjgx\" (UID: \"3745e088-a556-46c2-90cf-468698371ccf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.473698 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"
Mar 10 16:45:00 crc kubenswrapper[4749]: I0310 16:45:00.904623 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"]
Mar 10 16:45:00 crc kubenswrapper[4749]: W0310 16:45:00.911081 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3745e088_a556_46c2_90cf_468698371ccf.slice/crio-7101a691b11eec5307376d46be66cd5f90c43586e54cd00cc80f0786244c8cb6 WatchSource:0}: Error finding container 7101a691b11eec5307376d46be66cd5f90c43586e54cd00cc80f0786244c8cb6: Status 404 returned error can't find the container with id 7101a691b11eec5307376d46be66cd5f90c43586e54cd00cc80f0786244c8cb6
Mar 10 16:45:01 crc kubenswrapper[4749]: I0310 16:45:01.863920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx" event={"ID":"3745e088-a556-46c2-90cf-468698371ccf","Type":"ContainerStarted","Data":"1cc09adc0ba0684fe95206d91b748578f4bc68cc4da4515669c1d91569420ba6"}
Mar 10 16:45:01 crc kubenswrapper[4749]: I0310 16:45:01.864203 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx" event={"ID":"3745e088-a556-46c2-90cf-468698371ccf","Type":"ContainerStarted","Data":"7101a691b11eec5307376d46be66cd5f90c43586e54cd00cc80f0786244c8cb6"}
Mar 10 16:45:02 crc kubenswrapper[4749]: I0310 16:45:02.884742 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx" podStartSLOduration=2.8847251 podStartE2EDuration="2.8847251s" podCreationTimestamp="2026-03-10 16:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:45:02.880454403 +0000 UTC m=+3400.002320090" watchObservedRunningTime="2026-03-10 16:45:02.8847251 +0000 UTC m=+3400.006590787"
Mar 10 16:45:03 crc kubenswrapper[4749]: I0310 16:45:03.450447 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:45:03 crc kubenswrapper[4749]: I0310 16:45:03.450872 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:45:03 crc kubenswrapper[4749]: I0310 16:45:03.495267 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:45:03 crc kubenswrapper[4749]: I0310 16:45:03.879642 4749 generic.go:334] "Generic (PLEG): container finished" podID="3745e088-a556-46c2-90cf-468698371ccf" containerID="1cc09adc0ba0684fe95206d91b748578f4bc68cc4da4515669c1d91569420ba6" exitCode=0
Mar 10 16:45:03 crc kubenswrapper[4749]: I0310 16:45:03.879748 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx" event={"ID":"3745e088-a556-46c2-90cf-468698371ccf","Type":"ContainerDied","Data":"1cc09adc0ba0684fe95206d91b748578f4bc68cc4da4515669c1d91569420ba6"}
Mar 10 16:45:03 crc kubenswrapper[4749]: I0310 16:45:03.932660 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.235600 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.342282 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3745e088-a556-46c2-90cf-468698371ccf-config-volume\") pod \"3745e088-a556-46c2-90cf-468698371ccf\" (UID: \"3745e088-a556-46c2-90cf-468698371ccf\") "
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.342663 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3745e088-a556-46c2-90cf-468698371ccf-secret-volume\") pod \"3745e088-a556-46c2-90cf-468698371ccf\" (UID: \"3745e088-a556-46c2-90cf-468698371ccf\") "
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.342723 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kfx2\" (UniqueName: \"kubernetes.io/projected/3745e088-a556-46c2-90cf-468698371ccf-kube-api-access-9kfx2\") pod \"3745e088-a556-46c2-90cf-468698371ccf\" (UID: \"3745e088-a556-46c2-90cf-468698371ccf\") "
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.343099 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3745e088-a556-46c2-90cf-468698371ccf-config-volume" (OuterVolumeSpecName: "config-volume") pod "3745e088-a556-46c2-90cf-468698371ccf" (UID: "3745e088-a556-46c2-90cf-468698371ccf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.356576 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3745e088-a556-46c2-90cf-468698371ccf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3745e088-a556-46c2-90cf-468698371ccf" (UID: "3745e088-a556-46c2-90cf-468698371ccf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.373683 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3745e088-a556-46c2-90cf-468698371ccf-kube-api-access-9kfx2" (OuterVolumeSpecName: "kube-api-access-9kfx2") pod "3745e088-a556-46c2-90cf-468698371ccf" (UID: "3745e088-a556-46c2-90cf-468698371ccf"). InnerVolumeSpecName "kube-api-access-9kfx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.444004 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3745e088-a556-46c2-90cf-468698371ccf-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.444043 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3745e088-a556-46c2-90cf-468698371ccf-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.444053 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kfx2\" (UniqueName: \"kubernetes.io/projected/3745e088-a556-46c2-90cf-468698371ccf-kube-api-access-9kfx2\") on node \"crc\" DevicePath \"\""
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.753018 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vztlz"]
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.895532 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx" event={"ID":"3745e088-a556-46c2-90cf-468698371ccf","Type":"ContainerDied","Data":"7101a691b11eec5307376d46be66cd5f90c43586e54cd00cc80f0786244c8cb6"}
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.895580 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7101a691b11eec5307376d46be66cd5f90c43586e54cd00cc80f0786244c8cb6"
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.895546 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.958952 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"]
Mar 10 16:45:05 crc kubenswrapper[4749]: I0310 16:45:05.965774 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552640-4sg2n"]
Mar 10 16:45:06 crc kubenswrapper[4749]: I0310 16:45:06.904329 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vztlz" podUID="7ab0f15c-784b-4594-8268-1ec9edd5c06d" containerName="registry-server" containerID="cri-o://9ed918595bf3b914b702b01045994c4a6a3cca95849725b07f151e559ce794fc" gracePeriod=2
Mar 10 16:45:07 crc kubenswrapper[4749]: I0310 16:45:07.618132 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90f3b22b-18d6-498a-a003-ffa4c40f362e" path="/var/lib/kubelet/pods/90f3b22b-18d6-498a-a003-ffa4c40f362e/volumes"
Mar 10 16:45:07 crc kubenswrapper[4749]: I0310 16:45:07.912861 4749 generic.go:334] "Generic (PLEG): container finished" podID="7ab0f15c-784b-4594-8268-1ec9edd5c06d" containerID="9ed918595bf3b914b702b01045994c4a6a3cca95849725b07f151e559ce794fc" exitCode=0
Mar 10 16:45:07 crc kubenswrapper[4749]: I0310 16:45:07.912923 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vztlz" event={"ID":"7ab0f15c-784b-4594-8268-1ec9edd5c06d","Type":"ContainerDied","Data":"9ed918595bf3b914b702b01045994c4a6a3cca95849725b07f151e559ce794fc"}
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.426755 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.588466 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ab0f15c-784b-4594-8268-1ec9edd5c06d-catalog-content\") pod \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\" (UID: \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\") "
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.588612 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvp6c\" (UniqueName: \"kubernetes.io/projected/7ab0f15c-784b-4594-8268-1ec9edd5c06d-kube-api-access-qvp6c\") pod \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\" (UID: \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\") "
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.588637 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ab0f15c-784b-4594-8268-1ec9edd5c06d-utilities\") pod \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\" (UID: \"7ab0f15c-784b-4594-8268-1ec9edd5c06d\") "
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.589658 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ab0f15c-784b-4594-8268-1ec9edd5c06d-utilities" (OuterVolumeSpecName: "utilities") pod "7ab0f15c-784b-4594-8268-1ec9edd5c06d" (UID: "7ab0f15c-784b-4594-8268-1ec9edd5c06d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.601842 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab0f15c-784b-4594-8268-1ec9edd5c06d-kube-api-access-qvp6c" (OuterVolumeSpecName: "kube-api-access-qvp6c") pod "7ab0f15c-784b-4594-8268-1ec9edd5c06d" (UID: "7ab0f15c-784b-4594-8268-1ec9edd5c06d"). InnerVolumeSpecName "kube-api-access-qvp6c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.690201 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvp6c\" (UniqueName: \"kubernetes.io/projected/7ab0f15c-784b-4594-8268-1ec9edd5c06d-kube-api-access-qvp6c\") on node \"crc\" DevicePath \"\""
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.690236 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ab0f15c-784b-4594-8268-1ec9edd5c06d-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.722221 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ab0f15c-784b-4594-8268-1ec9edd5c06d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ab0f15c-784b-4594-8268-1ec9edd5c06d" (UID: "7ab0f15c-784b-4594-8268-1ec9edd5c06d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.792108 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ab0f15c-784b-4594-8268-1ec9edd5c06d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.927044 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vztlz" event={"ID":"7ab0f15c-784b-4594-8268-1ec9edd5c06d","Type":"ContainerDied","Data":"f649bd83aef297ef556ed15c83f6fff6d3f341ac059ecbbd78c4420cd34af54c"}
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.927097 4749 scope.go:117] "RemoveContainer" containerID="9ed918595bf3b914b702b01045994c4a6a3cca95849725b07f151e559ce794fc"
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.927099 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vztlz"
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.945818 4749 scope.go:117] "RemoveContainer" containerID="77a5d7176f8a08c9630d39ca8c4649920c04bfd8b1d7fecf991b1b4a12f5a258"
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.957711 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vztlz"]
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.962382 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vztlz"]
Mar 10 16:45:08 crc kubenswrapper[4749]: I0310 16:45:08.985053 4749 scope.go:117] "RemoveContainer" containerID="f880e5fb7d6a9906059fa2b39d967ed2f17b355f8a50ae10a99500711fa87743"
Mar 10 16:45:09 crc kubenswrapper[4749]: I0310 16:45:09.615986 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ab0f15c-784b-4594-8268-1ec9edd5c06d" path="/var/lib/kubelet/pods/7ab0f15c-784b-4594-8268-1ec9edd5c06d/volumes"
Mar 10 16:45:10 crc kubenswrapper[4749]: I0310 16:45:10.606927 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373"
Mar 10 16:45:10 crc kubenswrapper[4749]: E0310 16:45:10.607174 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 16:45:21 crc kubenswrapper[4749]: I0310 16:45:21.607166 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373"
Mar 10 16:45:21 crc kubenswrapper[4749]: E0310 16:45:21.608270 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 16:45:25 crc kubenswrapper[4749]: I0310 16:45:25.407825 4749 scope.go:117] "RemoveContainer" containerID="096d014e267ba0f0ea011913fed5890b35fa832d09571c15acf58fd842b7bb79"
Mar 10 16:45:32 crc kubenswrapper[4749]: I0310 16:45:32.606685 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373"
Mar 10 16:45:32 crc kubenswrapper[4749]: E0310 16:45:32.607482 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 16:45:46 crc kubenswrapper[4749]: I0310 16:45:46.607789 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373"
Mar 10 16:45:46 crc kubenswrapper[4749]: E0310 16:45:46.609556 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.141611 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552686-5856b"]
Mar 10 16:46:00 crc kubenswrapper[4749]: E0310 16:46:00.142642 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3745e088-a556-46c2-90cf-468698371ccf" containerName="collect-profiles"
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.142664 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3745e088-a556-46c2-90cf-468698371ccf" containerName="collect-profiles"
Mar 10 16:46:00 crc kubenswrapper[4749]: E0310 16:46:00.142685 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab0f15c-784b-4594-8268-1ec9edd5c06d" containerName="extract-utilities"
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.142696 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab0f15c-784b-4594-8268-1ec9edd5c06d" containerName="extract-utilities"
Mar 10 16:46:00 crc kubenswrapper[4749]: E0310 16:46:00.142711 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab0f15c-784b-4594-8268-1ec9edd5c06d" containerName="registry-server"
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.142724 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab0f15c-784b-4594-8268-1ec9edd5c06d" containerName="registry-server"
Mar 10 16:46:00 crc kubenswrapper[4749]: E0310 16:46:00.142745 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab0f15c-784b-4594-8268-1ec9edd5c06d" containerName="extract-content"
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.142753 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab0f15c-784b-4594-8268-1ec9edd5c06d" containerName="extract-content"
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.142930 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3745e088-a556-46c2-90cf-468698371ccf" containerName="collect-profiles"
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.142959 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab0f15c-784b-4594-8268-1ec9edd5c06d" containerName="registry-server"
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.143558 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552686-5856b"
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.150242 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552686-5856b"]
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.151050 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.151104 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7"
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.151345 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.240749 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmm6t\" (UniqueName: \"kubernetes.io/projected/a0d8c93e-907c-40c3-9b96-efdddddf7f41-kube-api-access-fmm6t\") pod \"auto-csr-approver-29552686-5856b\" (UID: \"a0d8c93e-907c-40c3-9b96-efdddddf7f41\") " pod="openshift-infra/auto-csr-approver-29552686-5856b"
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.341814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmm6t\" (UniqueName: \"kubernetes.io/projected/a0d8c93e-907c-40c3-9b96-efdddddf7f41-kube-api-access-fmm6t\") pod \"auto-csr-approver-29552686-5856b\" (UID: \"a0d8c93e-907c-40c3-9b96-efdddddf7f41\") " pod="openshift-infra/auto-csr-approver-29552686-5856b"
Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.362615 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmm6t\" (UniqueName: \"kubernetes.io/projected/a0d8c93e-907c-40c3-9b96-efdddddf7f41-kube-api-access-fmm6t\") pod \"auto-csr-approver-29552686-5856b\" (UID: \"a0d8c93e-907c-40c3-9b96-efdddddf7f41\") "
pod="openshift-infra/auto-csr-approver-29552686-5856b" Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.462559 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552686-5856b" Mar 10 16:46:00 crc kubenswrapper[4749]: I0310 16:46:00.606525 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:46:00 crc kubenswrapper[4749]: E0310 16:46:00.606887 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:46:01 crc kubenswrapper[4749]: I0310 16:46:00.684568 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552686-5856b"] Mar 10 16:46:01 crc kubenswrapper[4749]: I0310 16:46:01.347304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552686-5856b" event={"ID":"a0d8c93e-907c-40c3-9b96-efdddddf7f41","Type":"ContainerStarted","Data":"3c6c5a4def5d001736875e2251bfb8d0ef79c372b8abd53ec819f34967a13ceb"} Mar 10 16:46:02 crc kubenswrapper[4749]: I0310 16:46:02.359368 4749 generic.go:334] "Generic (PLEG): container finished" podID="a0d8c93e-907c-40c3-9b96-efdddddf7f41" containerID="7d392fe96182774bb7ab3025c83fe18cec6f3ea49c09acce9437864ac4a11a64" exitCode=0 Mar 10 16:46:02 crc kubenswrapper[4749]: I0310 16:46:02.359482 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552686-5856b" event={"ID":"a0d8c93e-907c-40c3-9b96-efdddddf7f41","Type":"ContainerDied","Data":"7d392fe96182774bb7ab3025c83fe18cec6f3ea49c09acce9437864ac4a11a64"} 
Mar 10 16:46:03 crc kubenswrapper[4749]: I0310 16:46:03.708092 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552686-5856b" Mar 10 16:46:03 crc kubenswrapper[4749]: I0310 16:46:03.896941 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmm6t\" (UniqueName: \"kubernetes.io/projected/a0d8c93e-907c-40c3-9b96-efdddddf7f41-kube-api-access-fmm6t\") pod \"a0d8c93e-907c-40c3-9b96-efdddddf7f41\" (UID: \"a0d8c93e-907c-40c3-9b96-efdddddf7f41\") " Mar 10 16:46:03 crc kubenswrapper[4749]: I0310 16:46:03.902354 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d8c93e-907c-40c3-9b96-efdddddf7f41-kube-api-access-fmm6t" (OuterVolumeSpecName: "kube-api-access-fmm6t") pod "a0d8c93e-907c-40c3-9b96-efdddddf7f41" (UID: "a0d8c93e-907c-40c3-9b96-efdddddf7f41"). InnerVolumeSpecName "kube-api-access-fmm6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:46:03 crc kubenswrapper[4749]: I0310 16:46:03.999595 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmm6t\" (UniqueName: \"kubernetes.io/projected/a0d8c93e-907c-40c3-9b96-efdddddf7f41-kube-api-access-fmm6t\") on node \"crc\" DevicePath \"\"" Mar 10 16:46:04 crc kubenswrapper[4749]: I0310 16:46:04.377672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552686-5856b" event={"ID":"a0d8c93e-907c-40c3-9b96-efdddddf7f41","Type":"ContainerDied","Data":"3c6c5a4def5d001736875e2251bfb8d0ef79c372b8abd53ec819f34967a13ceb"} Mar 10 16:46:04 crc kubenswrapper[4749]: I0310 16:46:04.377741 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c6c5a4def5d001736875e2251bfb8d0ef79c372b8abd53ec819f34967a13ceb" Mar 10 16:46:04 crc kubenswrapper[4749]: I0310 16:46:04.377822 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552686-5856b" Mar 10 16:46:04 crc kubenswrapper[4749]: I0310 16:46:04.789050 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552680-krnvg"] Mar 10 16:46:04 crc kubenswrapper[4749]: I0310 16:46:04.793669 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552680-krnvg"] Mar 10 16:46:05 crc kubenswrapper[4749]: I0310 16:46:05.615900 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="224459f3-ed20-441b-9a6a-1a44f29c02ba" path="/var/lib/kubelet/pods/224459f3-ed20-441b-9a6a-1a44f29c02ba/volumes" Mar 10 16:46:14 crc kubenswrapper[4749]: I0310 16:46:14.606355 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:46:14 crc kubenswrapper[4749]: E0310 16:46:14.607081 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:46:25 crc kubenswrapper[4749]: I0310 16:46:25.478348 4749 scope.go:117] "RemoveContainer" containerID="ba8bc40e74c59d2cc35ba98df4a7c80dcc8e460d0d0262fe67c001484432df17" Mar 10 16:46:26 crc kubenswrapper[4749]: I0310 16:46:26.607331 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:46:26 crc kubenswrapper[4749]: E0310 16:46:26.608306 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:46:40 crc kubenswrapper[4749]: I0310 16:46:40.606950 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:46:40 crc kubenswrapper[4749]: E0310 16:46:40.607976 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:46:54 crc kubenswrapper[4749]: I0310 16:46:54.606828 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:46:55 crc kubenswrapper[4749]: I0310 16:46:55.762794 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"74293f44aec15e963c26cf00402fbd62f5c1d2c0bf3a5e91e07dee6478a803b1"} Mar 10 16:47:17 crc kubenswrapper[4749]: I0310 16:47:17.496633 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-5r86d" podUID="b4cb9d6b-00f0-478e-a275-2720e6f90e8a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.80:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 16:47:17 crc kubenswrapper[4749]: I0310 16:47:17.660472 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-q5cvw" 
podUID="904b24b3-e0d3-452a-855c-3cfc2f78a152" containerName="registry-server" probeResult="failure" output=< Mar 10 16:47:17 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 10 16:47:17 crc kubenswrapper[4749]: > Mar 10 16:47:17 crc kubenswrapper[4749]: I0310 16:47:17.672062 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-wxr96" podUID="61ae455d-1747-4883-b19d-3cbe4aa77dcd" containerName="registry-server" probeResult="failure" output=< Mar 10 16:47:17 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 10 16:47:17 crc kubenswrapper[4749]: > Mar 10 16:47:17 crc kubenswrapper[4749]: I0310 16:47:17.678343 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-q5cvw" podUID="904b24b3-e0d3-452a-855c-3cfc2f78a152" containerName="registry-server" probeResult="failure" output=< Mar 10 16:47:17 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 10 16:47:17 crc kubenswrapper[4749]: > Mar 10 16:47:17 crc kubenswrapper[4749]: I0310 16:47:17.679199 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-wxr96" podUID="61ae455d-1747-4883-b19d-3cbe4aa77dcd" containerName="registry-server" probeResult="failure" output=< Mar 10 16:47:17 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 10 16:47:17 crc kubenswrapper[4749]: > Mar 10 16:47:19 crc kubenswrapper[4749]: I0310 16:47:19.811485 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7khql"] Mar 10 16:47:19 crc kubenswrapper[4749]: E0310 16:47:19.811856 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d8c93e-907c-40c3-9b96-efdddddf7f41" containerName="oc" Mar 10 16:47:19 crc kubenswrapper[4749]: I0310 16:47:19.811875 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a0d8c93e-907c-40c3-9b96-efdddddf7f41" containerName="oc" Mar 10 16:47:19 crc kubenswrapper[4749]: I0310 16:47:19.812094 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d8c93e-907c-40c3-9b96-efdddddf7f41" containerName="oc" Mar 10 16:47:19 crc kubenswrapper[4749]: I0310 16:47:19.815489 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:19 crc kubenswrapper[4749]: I0310 16:47:19.835263 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7khql"] Mar 10 16:47:19 crc kubenswrapper[4749]: I0310 16:47:19.848284 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eafec51-f1a2-4ec0-ad36-3c62d8169922-catalog-content\") pod \"redhat-marketplace-7khql\" (UID: \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\") " pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:19 crc kubenswrapper[4749]: I0310 16:47:19.848358 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eafec51-f1a2-4ec0-ad36-3c62d8169922-utilities\") pod \"redhat-marketplace-7khql\" (UID: \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\") " pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:19 crc kubenswrapper[4749]: I0310 16:47:19.848467 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhddj\" (UniqueName: \"kubernetes.io/projected/8eafec51-f1a2-4ec0-ad36-3c62d8169922-kube-api-access-lhddj\") pod \"redhat-marketplace-7khql\" (UID: \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\") " pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:19 crc kubenswrapper[4749]: I0310 16:47:19.971979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eafec51-f1a2-4ec0-ad36-3c62d8169922-utilities\") pod \"redhat-marketplace-7khql\" (UID: \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\") " pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:19 crc kubenswrapper[4749]: I0310 16:47:19.972101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhddj\" (UniqueName: \"kubernetes.io/projected/8eafec51-f1a2-4ec0-ad36-3c62d8169922-kube-api-access-lhddj\") pod \"redhat-marketplace-7khql\" (UID: \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\") " pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:19 crc kubenswrapper[4749]: I0310 16:47:19.972196 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eafec51-f1a2-4ec0-ad36-3c62d8169922-catalog-content\") pod \"redhat-marketplace-7khql\" (UID: \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\") " pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:19 crc kubenswrapper[4749]: I0310 16:47:19.972653 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eafec51-f1a2-4ec0-ad36-3c62d8169922-utilities\") pod \"redhat-marketplace-7khql\" (UID: \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\") " pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:19 crc kubenswrapper[4749]: I0310 16:47:19.972681 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eafec51-f1a2-4ec0-ad36-3c62d8169922-catalog-content\") pod \"redhat-marketplace-7khql\" (UID: \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\") " pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:20 crc kubenswrapper[4749]: I0310 16:47:20.011678 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhddj\" (UniqueName: 
\"kubernetes.io/projected/8eafec51-f1a2-4ec0-ad36-3c62d8169922-kube-api-access-lhddj\") pod \"redhat-marketplace-7khql\" (UID: \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\") " pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:20 crc kubenswrapper[4749]: I0310 16:47:20.146356 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:20 crc kubenswrapper[4749]: I0310 16:47:20.654611 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7khql"] Mar 10 16:47:20 crc kubenswrapper[4749]: I0310 16:47:20.985006 4749 generic.go:334] "Generic (PLEG): container finished" podID="8eafec51-f1a2-4ec0-ad36-3c62d8169922" containerID="ccd8ce0ba05cd4b1ffe1d7ec0655ef42e2f44eb89f5d69e3576ad00a3121899c" exitCode=0 Mar 10 16:47:20 crc kubenswrapper[4749]: I0310 16:47:20.985048 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7khql" event={"ID":"8eafec51-f1a2-4ec0-ad36-3c62d8169922","Type":"ContainerDied","Data":"ccd8ce0ba05cd4b1ffe1d7ec0655ef42e2f44eb89f5d69e3576ad00a3121899c"} Mar 10 16:47:20 crc kubenswrapper[4749]: I0310 16:47:20.985073 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7khql" event={"ID":"8eafec51-f1a2-4ec0-ad36-3c62d8169922","Type":"ContainerStarted","Data":"c05e09879f917ae2dffb8fff161633e0258c04aa33a185ad39b89f20c83f35c5"} Mar 10 16:47:23 crc kubenswrapper[4749]: I0310 16:47:23.000790 4749 generic.go:334] "Generic (PLEG): container finished" podID="8eafec51-f1a2-4ec0-ad36-3c62d8169922" containerID="2e1c7aac378981e3dec030ea60f04f0f854247c44f85942607f27475fa6ff780" exitCode=0 Mar 10 16:47:23 crc kubenswrapper[4749]: I0310 16:47:23.000855 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7khql" 
event={"ID":"8eafec51-f1a2-4ec0-ad36-3c62d8169922","Type":"ContainerDied","Data":"2e1c7aac378981e3dec030ea60f04f0f854247c44f85942607f27475fa6ff780"} Mar 10 16:47:25 crc kubenswrapper[4749]: I0310 16:47:25.024116 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7khql" event={"ID":"8eafec51-f1a2-4ec0-ad36-3c62d8169922","Type":"ContainerStarted","Data":"b02a427d7a59a25d4ba4ae9307db1408489396b92bb850a69e68b24e397e49fb"} Mar 10 16:47:25 crc kubenswrapper[4749]: I0310 16:47:25.047899 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7khql" podStartSLOduration=2.98096045 podStartE2EDuration="6.047879457s" podCreationTimestamp="2026-03-10 16:47:19 +0000 UTC" firstStartedPulling="2026-03-10 16:47:20.986589322 +0000 UTC m=+3538.108455009" lastFinishedPulling="2026-03-10 16:47:24.053508329 +0000 UTC m=+3541.175374016" observedRunningTime="2026-03-10 16:47:25.044565447 +0000 UTC m=+3542.166431154" watchObservedRunningTime="2026-03-10 16:47:25.047879457 +0000 UTC m=+3542.169745154" Mar 10 16:47:30 crc kubenswrapper[4749]: I0310 16:47:30.148064 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:30 crc kubenswrapper[4749]: I0310 16:47:30.149461 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:30 crc kubenswrapper[4749]: I0310 16:47:30.201810 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:31 crc kubenswrapper[4749]: I0310 16:47:31.104641 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:31 crc kubenswrapper[4749]: I0310 16:47:31.143285 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-7khql"] Mar 10 16:47:33 crc kubenswrapper[4749]: I0310 16:47:33.074145 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7khql" podUID="8eafec51-f1a2-4ec0-ad36-3c62d8169922" containerName="registry-server" containerID="cri-o://b02a427d7a59a25d4ba4ae9307db1408489396b92bb850a69e68b24e397e49fb" gracePeriod=2 Mar 10 16:47:33 crc kubenswrapper[4749]: I0310 16:47:33.422507 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:33 crc kubenswrapper[4749]: I0310 16:47:33.491721 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eafec51-f1a2-4ec0-ad36-3c62d8169922-catalog-content\") pod \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\" (UID: \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\") " Mar 10 16:47:33 crc kubenswrapper[4749]: I0310 16:47:33.491861 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhddj\" (UniqueName: \"kubernetes.io/projected/8eafec51-f1a2-4ec0-ad36-3c62d8169922-kube-api-access-lhddj\") pod \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\" (UID: \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\") " Mar 10 16:47:33 crc kubenswrapper[4749]: I0310 16:47:33.491903 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eafec51-f1a2-4ec0-ad36-3c62d8169922-utilities\") pod \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\" (UID: \"8eafec51-f1a2-4ec0-ad36-3c62d8169922\") " Mar 10 16:47:33 crc kubenswrapper[4749]: I0310 16:47:33.493215 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eafec51-f1a2-4ec0-ad36-3c62d8169922-utilities" (OuterVolumeSpecName: "utilities") pod "8eafec51-f1a2-4ec0-ad36-3c62d8169922" (UID: 
"8eafec51-f1a2-4ec0-ad36-3c62d8169922"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:47:33 crc kubenswrapper[4749]: I0310 16:47:33.500058 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eafec51-f1a2-4ec0-ad36-3c62d8169922-kube-api-access-lhddj" (OuterVolumeSpecName: "kube-api-access-lhddj") pod "8eafec51-f1a2-4ec0-ad36-3c62d8169922" (UID: "8eafec51-f1a2-4ec0-ad36-3c62d8169922"). InnerVolumeSpecName "kube-api-access-lhddj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:47:33 crc kubenswrapper[4749]: I0310 16:47:33.519413 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eafec51-f1a2-4ec0-ad36-3c62d8169922-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8eafec51-f1a2-4ec0-ad36-3c62d8169922" (UID: "8eafec51-f1a2-4ec0-ad36-3c62d8169922"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:47:33 crc kubenswrapper[4749]: I0310 16:47:33.593047 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhddj\" (UniqueName: \"kubernetes.io/projected/8eafec51-f1a2-4ec0-ad36-3c62d8169922-kube-api-access-lhddj\") on node \"crc\" DevicePath \"\"" Mar 10 16:47:33 crc kubenswrapper[4749]: I0310 16:47:33.593302 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eafec51-f1a2-4ec0-ad36-3c62d8169922-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:47:33 crc kubenswrapper[4749]: I0310 16:47:33.593401 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eafec51-f1a2-4ec0-ad36-3c62d8169922-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.084807 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="8eafec51-f1a2-4ec0-ad36-3c62d8169922" containerID="b02a427d7a59a25d4ba4ae9307db1408489396b92bb850a69e68b24e397e49fb" exitCode=0 Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.084862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7khql" event={"ID":"8eafec51-f1a2-4ec0-ad36-3c62d8169922","Type":"ContainerDied","Data":"b02a427d7a59a25d4ba4ae9307db1408489396b92bb850a69e68b24e397e49fb"} Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.084902 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7khql" event={"ID":"8eafec51-f1a2-4ec0-ad36-3c62d8169922","Type":"ContainerDied","Data":"c05e09879f917ae2dffb8fff161633e0258c04aa33a185ad39b89f20c83f35c5"} Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.084925 4749 scope.go:117] "RemoveContainer" containerID="b02a427d7a59a25d4ba4ae9307db1408489396b92bb850a69e68b24e397e49fb" Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.084944 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7khql" Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.106064 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7khql"] Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.113645 4749 scope.go:117] "RemoveContainer" containerID="2e1c7aac378981e3dec030ea60f04f0f854247c44f85942607f27475fa6ff780" Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.116604 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7khql"] Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.131862 4749 scope.go:117] "RemoveContainer" containerID="ccd8ce0ba05cd4b1ffe1d7ec0655ef42e2f44eb89f5d69e3576ad00a3121899c" Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.165171 4749 scope.go:117] "RemoveContainer" containerID="b02a427d7a59a25d4ba4ae9307db1408489396b92bb850a69e68b24e397e49fb" Mar 10 16:47:34 crc kubenswrapper[4749]: E0310 16:47:34.165777 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b02a427d7a59a25d4ba4ae9307db1408489396b92bb850a69e68b24e397e49fb\": container with ID starting with b02a427d7a59a25d4ba4ae9307db1408489396b92bb850a69e68b24e397e49fb not found: ID does not exist" containerID="b02a427d7a59a25d4ba4ae9307db1408489396b92bb850a69e68b24e397e49fb" Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.165818 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b02a427d7a59a25d4ba4ae9307db1408489396b92bb850a69e68b24e397e49fb"} err="failed to get container status \"b02a427d7a59a25d4ba4ae9307db1408489396b92bb850a69e68b24e397e49fb\": rpc error: code = NotFound desc = could not find container \"b02a427d7a59a25d4ba4ae9307db1408489396b92bb850a69e68b24e397e49fb\": container with ID starting with b02a427d7a59a25d4ba4ae9307db1408489396b92bb850a69e68b24e397e49fb not found: 
ID does not exist" Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.165845 4749 scope.go:117] "RemoveContainer" containerID="2e1c7aac378981e3dec030ea60f04f0f854247c44f85942607f27475fa6ff780" Mar 10 16:47:34 crc kubenswrapper[4749]: E0310 16:47:34.166193 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e1c7aac378981e3dec030ea60f04f0f854247c44f85942607f27475fa6ff780\": container with ID starting with 2e1c7aac378981e3dec030ea60f04f0f854247c44f85942607f27475fa6ff780 not found: ID does not exist" containerID="2e1c7aac378981e3dec030ea60f04f0f854247c44f85942607f27475fa6ff780" Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.166226 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e1c7aac378981e3dec030ea60f04f0f854247c44f85942607f27475fa6ff780"} err="failed to get container status \"2e1c7aac378981e3dec030ea60f04f0f854247c44f85942607f27475fa6ff780\": rpc error: code = NotFound desc = could not find container \"2e1c7aac378981e3dec030ea60f04f0f854247c44f85942607f27475fa6ff780\": container with ID starting with 2e1c7aac378981e3dec030ea60f04f0f854247c44f85942607f27475fa6ff780 not found: ID does not exist" Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.166245 4749 scope.go:117] "RemoveContainer" containerID="ccd8ce0ba05cd4b1ffe1d7ec0655ef42e2f44eb89f5d69e3576ad00a3121899c" Mar 10 16:47:34 crc kubenswrapper[4749]: E0310 16:47:34.166543 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccd8ce0ba05cd4b1ffe1d7ec0655ef42e2f44eb89f5d69e3576ad00a3121899c\": container with ID starting with ccd8ce0ba05cd4b1ffe1d7ec0655ef42e2f44eb89f5d69e3576ad00a3121899c not found: ID does not exist" containerID="ccd8ce0ba05cd4b1ffe1d7ec0655ef42e2f44eb89f5d69e3576ad00a3121899c" Mar 10 16:47:34 crc kubenswrapper[4749]: I0310 16:47:34.166575 4749 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccd8ce0ba05cd4b1ffe1d7ec0655ef42e2f44eb89f5d69e3576ad00a3121899c"} err="failed to get container status \"ccd8ce0ba05cd4b1ffe1d7ec0655ef42e2f44eb89f5d69e3576ad00a3121899c\": rpc error: code = NotFound desc = could not find container \"ccd8ce0ba05cd4b1ffe1d7ec0655ef42e2f44eb89f5d69e3576ad00a3121899c\": container with ID starting with ccd8ce0ba05cd4b1ffe1d7ec0655ef42e2f44eb89f5d69e3576ad00a3121899c not found: ID does not exist" Mar 10 16:47:35 crc kubenswrapper[4749]: I0310 16:47:35.615999 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eafec51-f1a2-4ec0-ad36-3c62d8169922" path="/var/lib/kubelet/pods/8eafec51-f1a2-4ec0-ad36-3c62d8169922/volumes" Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.143221 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552688-r4bj7"] Mar 10 16:48:00 crc kubenswrapper[4749]: E0310 16:48:00.145495 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eafec51-f1a2-4ec0-ad36-3c62d8169922" containerName="registry-server" Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.145602 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eafec51-f1a2-4ec0-ad36-3c62d8169922" containerName="registry-server" Mar 10 16:48:00 crc kubenswrapper[4749]: E0310 16:48:00.145708 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eafec51-f1a2-4ec0-ad36-3c62d8169922" containerName="extract-content" Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.145781 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eafec51-f1a2-4ec0-ad36-3c62d8169922" containerName="extract-content" Mar 10 16:48:00 crc kubenswrapper[4749]: E0310 16:48:00.145862 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eafec51-f1a2-4ec0-ad36-3c62d8169922" containerName="extract-utilities" Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.145942 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8eafec51-f1a2-4ec0-ad36-3c62d8169922" containerName="extract-utilities" Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.160831 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eafec51-f1a2-4ec0-ad36-3c62d8169922" containerName="registry-server" Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.161952 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552688-r4bj7"] Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.162065 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552688-r4bj7" Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.164959 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.165580 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.166860 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.282830 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjnk2\" (UniqueName: \"kubernetes.io/projected/8db2d468-b4bb-4df7-b02c-26c2c0dad9f3-kube-api-access-zjnk2\") pod \"auto-csr-approver-29552688-r4bj7\" (UID: \"8db2d468-b4bb-4df7-b02c-26c2c0dad9f3\") " pod="openshift-infra/auto-csr-approver-29552688-r4bj7" Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.384101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjnk2\" (UniqueName: \"kubernetes.io/projected/8db2d468-b4bb-4df7-b02c-26c2c0dad9f3-kube-api-access-zjnk2\") pod \"auto-csr-approver-29552688-r4bj7\" (UID: 
\"8db2d468-b4bb-4df7-b02c-26c2c0dad9f3\") " pod="openshift-infra/auto-csr-approver-29552688-r4bj7" Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.406030 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjnk2\" (UniqueName: \"kubernetes.io/projected/8db2d468-b4bb-4df7-b02c-26c2c0dad9f3-kube-api-access-zjnk2\") pod \"auto-csr-approver-29552688-r4bj7\" (UID: \"8db2d468-b4bb-4df7-b02c-26c2c0dad9f3\") " pod="openshift-infra/auto-csr-approver-29552688-r4bj7" Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.483055 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552688-r4bj7" Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.943583 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552688-r4bj7"] Mar 10 16:48:00 crc kubenswrapper[4749]: I0310 16:48:00.946204 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:48:01 crc kubenswrapper[4749]: I0310 16:48:01.279167 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552688-r4bj7" event={"ID":"8db2d468-b4bb-4df7-b02c-26c2c0dad9f3","Type":"ContainerStarted","Data":"b76f3348f4efddff9f9faa1f7f28604512cae7cb9c37a65ead79d5748b5b5973"} Mar 10 16:48:03 crc kubenswrapper[4749]: I0310 16:48:03.294531 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db2d468-b4bb-4df7-b02c-26c2c0dad9f3" containerID="5c6d9540fb87ee5c46f7d5c91f937bfeb2c646574029c08739852705606a60b8" exitCode=0 Mar 10 16:48:03 crc kubenswrapper[4749]: I0310 16:48:03.294590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552688-r4bj7" event={"ID":"8db2d468-b4bb-4df7-b02c-26c2c0dad9f3","Type":"ContainerDied","Data":"5c6d9540fb87ee5c46f7d5c91f937bfeb2c646574029c08739852705606a60b8"} Mar 10 16:48:04 crc kubenswrapper[4749]: I0310 
16:48:04.560162 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552688-r4bj7" Mar 10 16:48:04 crc kubenswrapper[4749]: I0310 16:48:04.648643 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjnk2\" (UniqueName: \"kubernetes.io/projected/8db2d468-b4bb-4df7-b02c-26c2c0dad9f3-kube-api-access-zjnk2\") pod \"8db2d468-b4bb-4df7-b02c-26c2c0dad9f3\" (UID: \"8db2d468-b4bb-4df7-b02c-26c2c0dad9f3\") " Mar 10 16:48:04 crc kubenswrapper[4749]: I0310 16:48:04.653858 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db2d468-b4bb-4df7-b02c-26c2c0dad9f3-kube-api-access-zjnk2" (OuterVolumeSpecName: "kube-api-access-zjnk2") pod "8db2d468-b4bb-4df7-b02c-26c2c0dad9f3" (UID: "8db2d468-b4bb-4df7-b02c-26c2c0dad9f3"). InnerVolumeSpecName "kube-api-access-zjnk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:48:04 crc kubenswrapper[4749]: I0310 16:48:04.750649 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjnk2\" (UniqueName: \"kubernetes.io/projected/8db2d468-b4bb-4df7-b02c-26c2c0dad9f3-kube-api-access-zjnk2\") on node \"crc\" DevicePath \"\"" Mar 10 16:48:05 crc kubenswrapper[4749]: I0310 16:48:05.328741 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552688-r4bj7" event={"ID":"8db2d468-b4bb-4df7-b02c-26c2c0dad9f3","Type":"ContainerDied","Data":"b76f3348f4efddff9f9faa1f7f28604512cae7cb9c37a65ead79d5748b5b5973"} Mar 10 16:48:05 crc kubenswrapper[4749]: I0310 16:48:05.329096 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b76f3348f4efddff9f9faa1f7f28604512cae7cb9c37a65ead79d5748b5b5973" Mar 10 16:48:05 crc kubenswrapper[4749]: I0310 16:48:05.328787 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552688-r4bj7" Mar 10 16:48:05 crc kubenswrapper[4749]: I0310 16:48:05.625157 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552682-vwjmc"] Mar 10 16:48:05 crc kubenswrapper[4749]: I0310 16:48:05.629169 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552682-vwjmc"] Mar 10 16:48:07 crc kubenswrapper[4749]: I0310 16:48:07.620789 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6be2358-2dde-4390-aea2-bae8b848fa78" path="/var/lib/kubelet/pods/c6be2358-2dde-4390-aea2-bae8b848fa78/volumes" Mar 10 16:48:25 crc kubenswrapper[4749]: I0310 16:48:25.576064 4749 scope.go:117] "RemoveContainer" containerID="cda9e59371b7d15e8dca49490434a191fc9f7846103e774702b1068a768d59c4" Mar 10 16:49:20 crc kubenswrapper[4749]: I0310 16:49:20.980657 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:49:20 crc kubenswrapper[4749]: I0310 16:49:20.981270 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:49:50 crc kubenswrapper[4749]: I0310 16:49:50.980803 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:49:50 crc kubenswrapper[4749]: 
I0310 16:49:50.981510 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:50:00 crc kubenswrapper[4749]: I0310 16:50:00.149668 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552690-v6sjh"] Mar 10 16:50:00 crc kubenswrapper[4749]: E0310 16:50:00.150562 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db2d468-b4bb-4df7-b02c-26c2c0dad9f3" containerName="oc" Mar 10 16:50:00 crc kubenswrapper[4749]: I0310 16:50:00.150579 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db2d468-b4bb-4df7-b02c-26c2c0dad9f3" containerName="oc" Mar 10 16:50:00 crc kubenswrapper[4749]: I0310 16:50:00.150782 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db2d468-b4bb-4df7-b02c-26c2c0dad9f3" containerName="oc" Mar 10 16:50:00 crc kubenswrapper[4749]: I0310 16:50:00.151289 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552690-v6sjh" Mar 10 16:50:00 crc kubenswrapper[4749]: I0310 16:50:00.154222 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:50:00 crc kubenswrapper[4749]: I0310 16:50:00.154274 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:50:00 crc kubenswrapper[4749]: I0310 16:50:00.154483 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:50:00 crc kubenswrapper[4749]: I0310 16:50:00.159468 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552690-v6sjh"] Mar 10 16:50:00 crc kubenswrapper[4749]: I0310 16:50:00.310620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9wxq\" (UniqueName: \"kubernetes.io/projected/700bba28-9400-4971-a767-0b7012b59a1d-kube-api-access-r9wxq\") pod \"auto-csr-approver-29552690-v6sjh\" (UID: \"700bba28-9400-4971-a767-0b7012b59a1d\") " pod="openshift-infra/auto-csr-approver-29552690-v6sjh" Mar 10 16:50:00 crc kubenswrapper[4749]: I0310 16:50:00.412574 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9wxq\" (UniqueName: \"kubernetes.io/projected/700bba28-9400-4971-a767-0b7012b59a1d-kube-api-access-r9wxq\") pod \"auto-csr-approver-29552690-v6sjh\" (UID: \"700bba28-9400-4971-a767-0b7012b59a1d\") " pod="openshift-infra/auto-csr-approver-29552690-v6sjh" Mar 10 16:50:00 crc kubenswrapper[4749]: I0310 16:50:00.443335 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9wxq\" (UniqueName: \"kubernetes.io/projected/700bba28-9400-4971-a767-0b7012b59a1d-kube-api-access-r9wxq\") pod \"auto-csr-approver-29552690-v6sjh\" (UID: \"700bba28-9400-4971-a767-0b7012b59a1d\") " 
pod="openshift-infra/auto-csr-approver-29552690-v6sjh" Mar 10 16:50:00 crc kubenswrapper[4749]: I0310 16:50:00.481204 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552690-v6sjh" Mar 10 16:50:00 crc kubenswrapper[4749]: I0310 16:50:00.919803 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552690-v6sjh"] Mar 10 16:50:01 crc kubenswrapper[4749]: I0310 16:50:01.168904 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552690-v6sjh" event={"ID":"700bba28-9400-4971-a767-0b7012b59a1d","Type":"ContainerStarted","Data":"d929a7e34a5f410443660e3de615a2aa106a3c78493745a803509cb429fdb0d4"} Mar 10 16:50:03 crc kubenswrapper[4749]: I0310 16:50:03.188491 4749 generic.go:334] "Generic (PLEG): container finished" podID="700bba28-9400-4971-a767-0b7012b59a1d" containerID="077a582715f24a86e89cbbf5abbb33a598156559fb021acfd8a37775b3223464" exitCode=0 Mar 10 16:50:03 crc kubenswrapper[4749]: I0310 16:50:03.188560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552690-v6sjh" event={"ID":"700bba28-9400-4971-a767-0b7012b59a1d","Type":"ContainerDied","Data":"077a582715f24a86e89cbbf5abbb33a598156559fb021acfd8a37775b3223464"} Mar 10 16:50:04 crc kubenswrapper[4749]: I0310 16:50:04.461610 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552690-v6sjh" Mar 10 16:50:04 crc kubenswrapper[4749]: I0310 16:50:04.572673 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9wxq\" (UniqueName: \"kubernetes.io/projected/700bba28-9400-4971-a767-0b7012b59a1d-kube-api-access-r9wxq\") pod \"700bba28-9400-4971-a767-0b7012b59a1d\" (UID: \"700bba28-9400-4971-a767-0b7012b59a1d\") " Mar 10 16:50:04 crc kubenswrapper[4749]: I0310 16:50:04.578899 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700bba28-9400-4971-a767-0b7012b59a1d-kube-api-access-r9wxq" (OuterVolumeSpecName: "kube-api-access-r9wxq") pod "700bba28-9400-4971-a767-0b7012b59a1d" (UID: "700bba28-9400-4971-a767-0b7012b59a1d"). InnerVolumeSpecName "kube-api-access-r9wxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:50:04 crc kubenswrapper[4749]: I0310 16:50:04.674071 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9wxq\" (UniqueName: \"kubernetes.io/projected/700bba28-9400-4971-a767-0b7012b59a1d-kube-api-access-r9wxq\") on node \"crc\" DevicePath \"\"" Mar 10 16:50:05 crc kubenswrapper[4749]: I0310 16:50:05.206113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552690-v6sjh" event={"ID":"700bba28-9400-4971-a767-0b7012b59a1d","Type":"ContainerDied","Data":"d929a7e34a5f410443660e3de615a2aa106a3c78493745a803509cb429fdb0d4"} Mar 10 16:50:05 crc kubenswrapper[4749]: I0310 16:50:05.206158 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d929a7e34a5f410443660e3de615a2aa106a3c78493745a803509cb429fdb0d4" Mar 10 16:50:05 crc kubenswrapper[4749]: I0310 16:50:05.206165 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552690-v6sjh" Mar 10 16:50:05 crc kubenswrapper[4749]: I0310 16:50:05.528306 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552684-5pt8r"] Mar 10 16:50:05 crc kubenswrapper[4749]: I0310 16:50:05.534524 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552684-5pt8r"] Mar 10 16:50:05 crc kubenswrapper[4749]: I0310 16:50:05.614714 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b94a1492-616a-49e7-818e-9fc0533f91ce" path="/var/lib/kubelet/pods/b94a1492-616a-49e7-818e-9fc0533f91ce/volumes" Mar 10 16:50:20 crc kubenswrapper[4749]: I0310 16:50:20.980633 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:50:20 crc kubenswrapper[4749]: I0310 16:50:20.981680 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:50:20 crc kubenswrapper[4749]: I0310 16:50:20.981756 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 16:50:20 crc kubenswrapper[4749]: I0310 16:50:20.982467 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"74293f44aec15e963c26cf00402fbd62f5c1d2c0bf3a5e91e07dee6478a803b1"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:50:20 crc kubenswrapper[4749]: I0310 16:50:20.982530 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://74293f44aec15e963c26cf00402fbd62f5c1d2c0bf3a5e91e07dee6478a803b1" gracePeriod=600 Mar 10 16:50:21 crc kubenswrapper[4749]: I0310 16:50:21.331457 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="74293f44aec15e963c26cf00402fbd62f5c1d2c0bf3a5e91e07dee6478a803b1" exitCode=0 Mar 10 16:50:21 crc kubenswrapper[4749]: I0310 16:50:21.331528 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"74293f44aec15e963c26cf00402fbd62f5c1d2c0bf3a5e91e07dee6478a803b1"} Mar 10 16:50:21 crc kubenswrapper[4749]: I0310 16:50:21.331869 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b"} Mar 10 16:50:21 crc kubenswrapper[4749]: I0310 16:50:21.331892 4749 scope.go:117] "RemoveContainer" containerID="6be5daff979cea33edbb300f959602c359713805b4f084fc268c2f4e4651c373" Mar 10 16:50:25 crc kubenswrapper[4749]: I0310 16:50:25.673145 4749 scope.go:117] "RemoveContainer" containerID="a19acd38052fc166a2850d7b7b5969ebd89a681208074c7dc70a0b2ba1337ed3" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.089294 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-swv9b"] Mar 10 16:50:48 crc kubenswrapper[4749]: E0310 
16:50:48.090075 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700bba28-9400-4971-a767-0b7012b59a1d" containerName="oc" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.090089 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="700bba28-9400-4971-a767-0b7012b59a1d" containerName="oc" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.090231 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="700bba28-9400-4971-a767-0b7012b59a1d" containerName="oc" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.091219 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.108650 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-swv9b"] Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.237581 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a76b458-802a-4b0a-9848-b02892a20c57-catalog-content\") pod \"community-operators-swv9b\" (UID: \"4a76b458-802a-4b0a-9848-b02892a20c57\") " pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.237728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm2nq\" (UniqueName: \"kubernetes.io/projected/4a76b458-802a-4b0a-9848-b02892a20c57-kube-api-access-rm2nq\") pod \"community-operators-swv9b\" (UID: \"4a76b458-802a-4b0a-9848-b02892a20c57\") " pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.237777 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a76b458-802a-4b0a-9848-b02892a20c57-utilities\") pod 
\"community-operators-swv9b\" (UID: \"4a76b458-802a-4b0a-9848-b02892a20c57\") " pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.339290 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a76b458-802a-4b0a-9848-b02892a20c57-catalog-content\") pod \"community-operators-swv9b\" (UID: \"4a76b458-802a-4b0a-9848-b02892a20c57\") " pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.339419 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm2nq\" (UniqueName: \"kubernetes.io/projected/4a76b458-802a-4b0a-9848-b02892a20c57-kube-api-access-rm2nq\") pod \"community-operators-swv9b\" (UID: \"4a76b458-802a-4b0a-9848-b02892a20c57\") " pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.339455 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a76b458-802a-4b0a-9848-b02892a20c57-utilities\") pod \"community-operators-swv9b\" (UID: \"4a76b458-802a-4b0a-9848-b02892a20c57\") " pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.340452 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a76b458-802a-4b0a-9848-b02892a20c57-catalog-content\") pod \"community-operators-swv9b\" (UID: \"4a76b458-802a-4b0a-9848-b02892a20c57\") " pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.340488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a76b458-802a-4b0a-9848-b02892a20c57-utilities\") pod \"community-operators-swv9b\" (UID: 
\"4a76b458-802a-4b0a-9848-b02892a20c57\") " pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.357482 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm2nq\" (UniqueName: \"kubernetes.io/projected/4a76b458-802a-4b0a-9848-b02892a20c57-kube-api-access-rm2nq\") pod \"community-operators-swv9b\" (UID: \"4a76b458-802a-4b0a-9848-b02892a20c57\") " pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.408225 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:48 crc kubenswrapper[4749]: I0310 16:50:48.882258 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-swv9b"] Mar 10 16:50:49 crc kubenswrapper[4749]: I0310 16:50:49.556622 4749 generic.go:334] "Generic (PLEG): container finished" podID="4a76b458-802a-4b0a-9848-b02892a20c57" containerID="cf67f9fdaaf2ac3e42bccb84a13009ab3b998bf6ba0408958562c3b98a3ffd84" exitCode=0 Mar 10 16:50:49 crc kubenswrapper[4749]: I0310 16:50:49.556974 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swv9b" event={"ID":"4a76b458-802a-4b0a-9848-b02892a20c57","Type":"ContainerDied","Data":"cf67f9fdaaf2ac3e42bccb84a13009ab3b998bf6ba0408958562c3b98a3ffd84"} Mar 10 16:50:49 crc kubenswrapper[4749]: I0310 16:50:49.557031 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swv9b" event={"ID":"4a76b458-802a-4b0a-9848-b02892a20c57","Type":"ContainerStarted","Data":"db1fb31ff8bf82d0006be61710fd70aa6fb88a3c7c0dbadf092c14b28b7b0f3c"} Mar 10 16:50:50 crc kubenswrapper[4749]: I0310 16:50:50.564905 4749 generic.go:334] "Generic (PLEG): container finished" podID="4a76b458-802a-4b0a-9848-b02892a20c57" 
containerID="30f50201a5244e1ba9c95f3f419dde79df4f392244186af0d360240a8eddeb1b" exitCode=0 Mar 10 16:50:50 crc kubenswrapper[4749]: I0310 16:50:50.565011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swv9b" event={"ID":"4a76b458-802a-4b0a-9848-b02892a20c57","Type":"ContainerDied","Data":"30f50201a5244e1ba9c95f3f419dde79df4f392244186af0d360240a8eddeb1b"} Mar 10 16:50:51 crc kubenswrapper[4749]: I0310 16:50:51.579906 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swv9b" event={"ID":"4a76b458-802a-4b0a-9848-b02892a20c57","Type":"ContainerStarted","Data":"1448d90e04b7ff2a0a2bdf9805d95fde59014f45af3ed9e91414a2eb5be337fa"} Mar 10 16:50:51 crc kubenswrapper[4749]: I0310 16:50:51.614029 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-swv9b" podStartSLOduration=2.056106936 podStartE2EDuration="3.614009116s" podCreationTimestamp="2026-03-10 16:50:48 +0000 UTC" firstStartedPulling="2026-03-10 16:50:49.558149577 +0000 UTC m=+3746.680015264" lastFinishedPulling="2026-03-10 16:50:51.116051747 +0000 UTC m=+3748.237917444" observedRunningTime="2026-03-10 16:50:51.603535573 +0000 UTC m=+3748.725401270" watchObservedRunningTime="2026-03-10 16:50:51.614009116 +0000 UTC m=+3748.735874803" Mar 10 16:50:58 crc kubenswrapper[4749]: I0310 16:50:58.408834 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:58 crc kubenswrapper[4749]: I0310 16:50:58.409437 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:58 crc kubenswrapper[4749]: I0310 16:50:58.449160 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:58 crc kubenswrapper[4749]: I0310 
16:50:58.671713 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:50:58 crc kubenswrapper[4749]: I0310 16:50:58.730569 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-swv9b"] Mar 10 16:51:00 crc kubenswrapper[4749]: I0310 16:51:00.641166 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-swv9b" podUID="4a76b458-802a-4b0a-9848-b02892a20c57" containerName="registry-server" containerID="cri-o://1448d90e04b7ff2a0a2bdf9805d95fde59014f45af3ed9e91414a2eb5be337fa" gracePeriod=2 Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.011442 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.123436 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a76b458-802a-4b0a-9848-b02892a20c57-utilities\") pod \"4a76b458-802a-4b0a-9848-b02892a20c57\" (UID: \"4a76b458-802a-4b0a-9848-b02892a20c57\") " Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.123554 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a76b458-802a-4b0a-9848-b02892a20c57-catalog-content\") pod \"4a76b458-802a-4b0a-9848-b02892a20c57\" (UID: \"4a76b458-802a-4b0a-9848-b02892a20c57\") " Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.123661 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm2nq\" (UniqueName: \"kubernetes.io/projected/4a76b458-802a-4b0a-9848-b02892a20c57-kube-api-access-rm2nq\") pod \"4a76b458-802a-4b0a-9848-b02892a20c57\" (UID: \"4a76b458-802a-4b0a-9848-b02892a20c57\") " Mar 10 16:51:01 crc kubenswrapper[4749]: 
I0310 16:51:01.124723 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a76b458-802a-4b0a-9848-b02892a20c57-utilities" (OuterVolumeSpecName: "utilities") pod "4a76b458-802a-4b0a-9848-b02892a20c57" (UID: "4a76b458-802a-4b0a-9848-b02892a20c57"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.130740 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a76b458-802a-4b0a-9848-b02892a20c57-kube-api-access-rm2nq" (OuterVolumeSpecName: "kube-api-access-rm2nq") pod "4a76b458-802a-4b0a-9848-b02892a20c57" (UID: "4a76b458-802a-4b0a-9848-b02892a20c57"). InnerVolumeSpecName "kube-api-access-rm2nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.175795 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a76b458-802a-4b0a-9848-b02892a20c57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a76b458-802a-4b0a-9848-b02892a20c57" (UID: "4a76b458-802a-4b0a-9848-b02892a20c57"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.224999 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a76b458-802a-4b0a-9848-b02892a20c57-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.225041 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a76b458-802a-4b0a-9848-b02892a20c57-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.225055 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm2nq\" (UniqueName: \"kubernetes.io/projected/4a76b458-802a-4b0a-9848-b02892a20c57-kube-api-access-rm2nq\") on node \"crc\" DevicePath \"\"" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.651917 4749 generic.go:334] "Generic (PLEG): container finished" podID="4a76b458-802a-4b0a-9848-b02892a20c57" containerID="1448d90e04b7ff2a0a2bdf9805d95fde59014f45af3ed9e91414a2eb5be337fa" exitCode=0 Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.651956 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swv9b" event={"ID":"4a76b458-802a-4b0a-9848-b02892a20c57","Type":"ContainerDied","Data":"1448d90e04b7ff2a0a2bdf9805d95fde59014f45af3ed9e91414a2eb5be337fa"} Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.651981 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-swv9b" event={"ID":"4a76b458-802a-4b0a-9848-b02892a20c57","Type":"ContainerDied","Data":"db1fb31ff8bf82d0006be61710fd70aa6fb88a3c7c0dbadf092c14b28b7b0f3c"} Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.652015 4749 scope.go:117] "RemoveContainer" containerID="1448d90e04b7ff2a0a2bdf9805d95fde59014f45af3ed9e91414a2eb5be337fa" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 
16:51:01.652030 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-swv9b" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.678400 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-swv9b"] Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.684470 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-swv9b"] Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.685336 4749 scope.go:117] "RemoveContainer" containerID="30f50201a5244e1ba9c95f3f419dde79df4f392244186af0d360240a8eddeb1b" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.707166 4749 scope.go:117] "RemoveContainer" containerID="cf67f9fdaaf2ac3e42bccb84a13009ab3b998bf6ba0408958562c3b98a3ffd84" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.731535 4749 scope.go:117] "RemoveContainer" containerID="1448d90e04b7ff2a0a2bdf9805d95fde59014f45af3ed9e91414a2eb5be337fa" Mar 10 16:51:01 crc kubenswrapper[4749]: E0310 16:51:01.732050 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1448d90e04b7ff2a0a2bdf9805d95fde59014f45af3ed9e91414a2eb5be337fa\": container with ID starting with 1448d90e04b7ff2a0a2bdf9805d95fde59014f45af3ed9e91414a2eb5be337fa not found: ID does not exist" containerID="1448d90e04b7ff2a0a2bdf9805d95fde59014f45af3ed9e91414a2eb5be337fa" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.732081 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1448d90e04b7ff2a0a2bdf9805d95fde59014f45af3ed9e91414a2eb5be337fa"} err="failed to get container status \"1448d90e04b7ff2a0a2bdf9805d95fde59014f45af3ed9e91414a2eb5be337fa\": rpc error: code = NotFound desc = could not find container \"1448d90e04b7ff2a0a2bdf9805d95fde59014f45af3ed9e91414a2eb5be337fa\": container with ID starting with 
1448d90e04b7ff2a0a2bdf9805d95fde59014f45af3ed9e91414a2eb5be337fa not found: ID does not exist" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.732105 4749 scope.go:117] "RemoveContainer" containerID="30f50201a5244e1ba9c95f3f419dde79df4f392244186af0d360240a8eddeb1b" Mar 10 16:51:01 crc kubenswrapper[4749]: E0310 16:51:01.732806 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30f50201a5244e1ba9c95f3f419dde79df4f392244186af0d360240a8eddeb1b\": container with ID starting with 30f50201a5244e1ba9c95f3f419dde79df4f392244186af0d360240a8eddeb1b not found: ID does not exist" containerID="30f50201a5244e1ba9c95f3f419dde79df4f392244186af0d360240a8eddeb1b" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.732827 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30f50201a5244e1ba9c95f3f419dde79df4f392244186af0d360240a8eddeb1b"} err="failed to get container status \"30f50201a5244e1ba9c95f3f419dde79df4f392244186af0d360240a8eddeb1b\": rpc error: code = NotFound desc = could not find container \"30f50201a5244e1ba9c95f3f419dde79df4f392244186af0d360240a8eddeb1b\": container with ID starting with 30f50201a5244e1ba9c95f3f419dde79df4f392244186af0d360240a8eddeb1b not found: ID does not exist" Mar 10 16:51:01 crc kubenswrapper[4749]: I0310 16:51:01.732841 4749 scope.go:117] "RemoveContainer" containerID="cf67f9fdaaf2ac3e42bccb84a13009ab3b998bf6ba0408958562c3b98a3ffd84" Mar 10 16:51:01 crc kubenswrapper[4749]: E0310 16:51:01.733117 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf67f9fdaaf2ac3e42bccb84a13009ab3b998bf6ba0408958562c3b98a3ffd84\": container with ID starting with cf67f9fdaaf2ac3e42bccb84a13009ab3b998bf6ba0408958562c3b98a3ffd84 not found: ID does not exist" containerID="cf67f9fdaaf2ac3e42bccb84a13009ab3b998bf6ba0408958562c3b98a3ffd84" Mar 10 16:51:01 crc 
kubenswrapper[4749]: I0310 16:51:01.733225 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf67f9fdaaf2ac3e42bccb84a13009ab3b998bf6ba0408958562c3b98a3ffd84"} err="failed to get container status \"cf67f9fdaaf2ac3e42bccb84a13009ab3b998bf6ba0408958562c3b98a3ffd84\": rpc error: code = NotFound desc = could not find container \"cf67f9fdaaf2ac3e42bccb84a13009ab3b998bf6ba0408958562c3b98a3ffd84\": container with ID starting with cf67f9fdaaf2ac3e42bccb84a13009ab3b998bf6ba0408958562c3b98a3ffd84 not found: ID does not exist" Mar 10 16:51:03 crc kubenswrapper[4749]: I0310 16:51:03.616661 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a76b458-802a-4b0a-9848-b02892a20c57" path="/var/lib/kubelet/pods/4a76b458-802a-4b0a-9848-b02892a20c57/volumes" Mar 10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.135932 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552692-7pnnm"] Mar 10 16:52:00 crc kubenswrapper[4749]: E0310 16:52:00.136754 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a76b458-802a-4b0a-9848-b02892a20c57" containerName="extract-content" Mar 10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.136769 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a76b458-802a-4b0a-9848-b02892a20c57" containerName="extract-content" Mar 10 16:52:00 crc kubenswrapper[4749]: E0310 16:52:00.136801 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a76b458-802a-4b0a-9848-b02892a20c57" containerName="registry-server" Mar 10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.136811 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a76b458-802a-4b0a-9848-b02892a20c57" containerName="registry-server" Mar 10 16:52:00 crc kubenswrapper[4749]: E0310 16:52:00.136825 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a76b458-802a-4b0a-9848-b02892a20c57" containerName="extract-utilities" Mar 
10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.136833 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a76b458-802a-4b0a-9848-b02892a20c57" containerName="extract-utilities" Mar 10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.136986 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a76b458-802a-4b0a-9848-b02892a20c57" containerName="registry-server" Mar 10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.137456 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552692-7pnnm" Mar 10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.140224 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.140337 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.141111 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.151867 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552692-7pnnm"] Mar 10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.186502 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5npj\" (UniqueName: \"kubernetes.io/projected/c82c5f08-6ea1-4f78-af60-7d14d3ecaf16-kube-api-access-v5npj\") pod \"auto-csr-approver-29552692-7pnnm\" (UID: \"c82c5f08-6ea1-4f78-af60-7d14d3ecaf16\") " pod="openshift-infra/auto-csr-approver-29552692-7pnnm" Mar 10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.287435 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5npj\" (UniqueName: \"kubernetes.io/projected/c82c5f08-6ea1-4f78-af60-7d14d3ecaf16-kube-api-access-v5npj\") 
pod \"auto-csr-approver-29552692-7pnnm\" (UID: \"c82c5f08-6ea1-4f78-af60-7d14d3ecaf16\") " pod="openshift-infra/auto-csr-approver-29552692-7pnnm" Mar 10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.309397 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5npj\" (UniqueName: \"kubernetes.io/projected/c82c5f08-6ea1-4f78-af60-7d14d3ecaf16-kube-api-access-v5npj\") pod \"auto-csr-approver-29552692-7pnnm\" (UID: \"c82c5f08-6ea1-4f78-af60-7d14d3ecaf16\") " pod="openshift-infra/auto-csr-approver-29552692-7pnnm" Mar 10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.462250 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552692-7pnnm" Mar 10 16:52:00 crc kubenswrapper[4749]: I0310 16:52:00.704604 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552692-7pnnm"] Mar 10 16:52:01 crc kubenswrapper[4749]: I0310 16:52:01.103241 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552692-7pnnm" event={"ID":"c82c5f08-6ea1-4f78-af60-7d14d3ecaf16","Type":"ContainerStarted","Data":"4ae19df69d26c22edf7b02d2e9929cf6cedd0f524e8c20eac92c089ea100e889"} Mar 10 16:52:02 crc kubenswrapper[4749]: I0310 16:52:02.111796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552692-7pnnm" event={"ID":"c82c5f08-6ea1-4f78-af60-7d14d3ecaf16","Type":"ContainerStarted","Data":"7af55ea33703b408b500778b57ba189ff9753b30f2fcfb2012ea922449b648cc"} Mar 10 16:52:02 crc kubenswrapper[4749]: I0310 16:52:02.135421 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552692-7pnnm" podStartSLOduration=1.097896477 podStartE2EDuration="2.135402181s" podCreationTimestamp="2026-03-10 16:52:00 +0000 UTC" firstStartedPulling="2026-03-10 16:52:00.709164953 +0000 UTC m=+3817.831030640" lastFinishedPulling="2026-03-10 
16:52:01.746670657 +0000 UTC m=+3818.868536344" observedRunningTime="2026-03-10 16:52:02.132482252 +0000 UTC m=+3819.254347949" watchObservedRunningTime="2026-03-10 16:52:02.135402181 +0000 UTC m=+3819.257267858" Mar 10 16:52:03 crc kubenswrapper[4749]: I0310 16:52:03.121076 4749 generic.go:334] "Generic (PLEG): container finished" podID="c82c5f08-6ea1-4f78-af60-7d14d3ecaf16" containerID="7af55ea33703b408b500778b57ba189ff9753b30f2fcfb2012ea922449b648cc" exitCode=0 Mar 10 16:52:03 crc kubenswrapper[4749]: I0310 16:52:03.121114 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552692-7pnnm" event={"ID":"c82c5f08-6ea1-4f78-af60-7d14d3ecaf16","Type":"ContainerDied","Data":"7af55ea33703b408b500778b57ba189ff9753b30f2fcfb2012ea922449b648cc"} Mar 10 16:52:04 crc kubenswrapper[4749]: I0310 16:52:04.434953 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552692-7pnnm" Mar 10 16:52:04 crc kubenswrapper[4749]: I0310 16:52:04.558851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5npj\" (UniqueName: \"kubernetes.io/projected/c82c5f08-6ea1-4f78-af60-7d14d3ecaf16-kube-api-access-v5npj\") pod \"c82c5f08-6ea1-4f78-af60-7d14d3ecaf16\" (UID: \"c82c5f08-6ea1-4f78-af60-7d14d3ecaf16\") " Mar 10 16:52:04 crc kubenswrapper[4749]: I0310 16:52:04.564403 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c82c5f08-6ea1-4f78-af60-7d14d3ecaf16-kube-api-access-v5npj" (OuterVolumeSpecName: "kube-api-access-v5npj") pod "c82c5f08-6ea1-4f78-af60-7d14d3ecaf16" (UID: "c82c5f08-6ea1-4f78-af60-7d14d3ecaf16"). InnerVolumeSpecName "kube-api-access-v5npj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:52:04 crc kubenswrapper[4749]: I0310 16:52:04.660339 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5npj\" (UniqueName: \"kubernetes.io/projected/c82c5f08-6ea1-4f78-af60-7d14d3ecaf16-kube-api-access-v5npj\") on node \"crc\" DevicePath \"\"" Mar 10 16:52:05 crc kubenswrapper[4749]: I0310 16:52:05.136530 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552692-7pnnm" event={"ID":"c82c5f08-6ea1-4f78-af60-7d14d3ecaf16","Type":"ContainerDied","Data":"4ae19df69d26c22edf7b02d2e9929cf6cedd0f524e8c20eac92c089ea100e889"} Mar 10 16:52:05 crc kubenswrapper[4749]: I0310 16:52:05.136572 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ae19df69d26c22edf7b02d2e9929cf6cedd0f524e8c20eac92c089ea100e889" Mar 10 16:52:05 crc kubenswrapper[4749]: I0310 16:52:05.136578 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552692-7pnnm" Mar 10 16:52:05 crc kubenswrapper[4749]: I0310 16:52:05.208132 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552686-5856b"] Mar 10 16:52:05 crc kubenswrapper[4749]: I0310 16:52:05.213747 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552686-5856b"] Mar 10 16:52:05 crc kubenswrapper[4749]: I0310 16:52:05.618529 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0d8c93e-907c-40c3-9b96-efdddddf7f41" path="/var/lib/kubelet/pods/a0d8c93e-907c-40c3-9b96-efdddddf7f41/volumes" Mar 10 16:52:25 crc kubenswrapper[4749]: I0310 16:52:25.763810 4749 scope.go:117] "RemoveContainer" containerID="7d392fe96182774bb7ab3025c83fe18cec6f3ea49c09acce9437864ac4a11a64" Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.437114 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-snjmd"] Mar 10 16:52:35 crc kubenswrapper[4749]: E0310 16:52:35.438491 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82c5f08-6ea1-4f78-af60-7d14d3ecaf16" containerName="oc" Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.438652 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82c5f08-6ea1-4f78-af60-7d14d3ecaf16" containerName="oc" Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.443783 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c82c5f08-6ea1-4f78-af60-7d14d3ecaf16" containerName="oc" Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.448715 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.476174 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-snjmd"] Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.527628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfnxc\" (UniqueName: \"kubernetes.io/projected/35489642-ba38-4b8f-b486-d331995d68e8-kube-api-access-vfnxc\") pod \"certified-operators-snjmd\" (UID: \"35489642-ba38-4b8f-b486-d331995d68e8\") " pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.527723 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35489642-ba38-4b8f-b486-d331995d68e8-utilities\") pod \"certified-operators-snjmd\" (UID: \"35489642-ba38-4b8f-b486-d331995d68e8\") " pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.527764 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/35489642-ba38-4b8f-b486-d331995d68e8-catalog-content\") pod \"certified-operators-snjmd\" (UID: \"35489642-ba38-4b8f-b486-d331995d68e8\") " pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.629670 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35489642-ba38-4b8f-b486-d331995d68e8-utilities\") pod \"certified-operators-snjmd\" (UID: \"35489642-ba38-4b8f-b486-d331995d68e8\") " pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.629758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35489642-ba38-4b8f-b486-d331995d68e8-catalog-content\") pod \"certified-operators-snjmd\" (UID: \"35489642-ba38-4b8f-b486-d331995d68e8\") " pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.629831 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfnxc\" (UniqueName: \"kubernetes.io/projected/35489642-ba38-4b8f-b486-d331995d68e8-kube-api-access-vfnxc\") pod \"certified-operators-snjmd\" (UID: \"35489642-ba38-4b8f-b486-d331995d68e8\") " pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.630655 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35489642-ba38-4b8f-b486-d331995d68e8-utilities\") pod \"certified-operators-snjmd\" (UID: \"35489642-ba38-4b8f-b486-d331995d68e8\") " pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.631259 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/35489642-ba38-4b8f-b486-d331995d68e8-catalog-content\") pod \"certified-operators-snjmd\" (UID: \"35489642-ba38-4b8f-b486-d331995d68e8\") " pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.652283 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfnxc\" (UniqueName: \"kubernetes.io/projected/35489642-ba38-4b8f-b486-d331995d68e8-kube-api-access-vfnxc\") pod \"certified-operators-snjmd\" (UID: \"35489642-ba38-4b8f-b486-d331995d68e8\") " pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:35 crc kubenswrapper[4749]: I0310 16:52:35.780937 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:36 crc kubenswrapper[4749]: I0310 16:52:36.561736 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-snjmd"] Mar 10 16:52:37 crc kubenswrapper[4749]: I0310 16:52:37.379325 4749 generic.go:334] "Generic (PLEG): container finished" podID="35489642-ba38-4b8f-b486-d331995d68e8" containerID="8af079d7a8c1f25058cf6bc5cc42bc13c723f84bca9c6de5e4e22a75db444f03" exitCode=0 Mar 10 16:52:37 crc kubenswrapper[4749]: I0310 16:52:37.379391 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snjmd" event={"ID":"35489642-ba38-4b8f-b486-d331995d68e8","Type":"ContainerDied","Data":"8af079d7a8c1f25058cf6bc5cc42bc13c723f84bca9c6de5e4e22a75db444f03"} Mar 10 16:52:37 crc kubenswrapper[4749]: I0310 16:52:37.379671 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snjmd" event={"ID":"35489642-ba38-4b8f-b486-d331995d68e8","Type":"ContainerStarted","Data":"cf249396bf212aa752c5dfe82aed197492d63520db01adfb46bc43caa04cdb73"} Mar 10 16:52:39 crc kubenswrapper[4749]: I0310 16:52:39.397582 4749 generic.go:334] "Generic (PLEG): container 
finished" podID="35489642-ba38-4b8f-b486-d331995d68e8" containerID="ddb85a70890ed34d7a69ac0f0b375a7d7f27f9a63ed0cbc84021f57704300f64" exitCode=0 Mar 10 16:52:39 crc kubenswrapper[4749]: I0310 16:52:39.397658 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snjmd" event={"ID":"35489642-ba38-4b8f-b486-d331995d68e8","Type":"ContainerDied","Data":"ddb85a70890ed34d7a69ac0f0b375a7d7f27f9a63ed0cbc84021f57704300f64"} Mar 10 16:52:40 crc kubenswrapper[4749]: I0310 16:52:40.407504 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snjmd" event={"ID":"35489642-ba38-4b8f-b486-d331995d68e8","Type":"ContainerStarted","Data":"abe34a7616d1f51b36a8c8269cc4f9a1f377b31973533498637db29b7f452cfc"} Mar 10 16:52:40 crc kubenswrapper[4749]: I0310 16:52:40.440556 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-snjmd" podStartSLOduration=2.988047151 podStartE2EDuration="5.440524009s" podCreationTimestamp="2026-03-10 16:52:35 +0000 UTC" firstStartedPulling="2026-03-10 16:52:37.381837884 +0000 UTC m=+3854.503703581" lastFinishedPulling="2026-03-10 16:52:39.834314752 +0000 UTC m=+3856.956180439" observedRunningTime="2026-03-10 16:52:40.43318467 +0000 UTC m=+3857.555050357" watchObservedRunningTime="2026-03-10 16:52:40.440524009 +0000 UTC m=+3857.562389726" Mar 10 16:52:45 crc kubenswrapper[4749]: I0310 16:52:45.781808 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:45 crc kubenswrapper[4749]: I0310 16:52:45.782291 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:45 crc kubenswrapper[4749]: I0310 16:52:45.857655 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-snjmd" Mar 
10 16:52:46 crc kubenswrapper[4749]: I0310 16:52:46.530035 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:46 crc kubenswrapper[4749]: I0310 16:52:46.598237 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-snjmd"] Mar 10 16:52:48 crc kubenswrapper[4749]: I0310 16:52:48.471467 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-snjmd" podUID="35489642-ba38-4b8f-b486-d331995d68e8" containerName="registry-server" containerID="cri-o://abe34a7616d1f51b36a8c8269cc4f9a1f377b31973533498637db29b7f452cfc" gracePeriod=2 Mar 10 16:52:48 crc kubenswrapper[4749]: I0310 16:52:48.838976 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:48 crc kubenswrapper[4749]: I0310 16:52:48.932289 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35489642-ba38-4b8f-b486-d331995d68e8-utilities\") pod \"35489642-ba38-4b8f-b486-d331995d68e8\" (UID: \"35489642-ba38-4b8f-b486-d331995d68e8\") " Mar 10 16:52:48 crc kubenswrapper[4749]: I0310 16:52:48.932404 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35489642-ba38-4b8f-b486-d331995d68e8-catalog-content\") pod \"35489642-ba38-4b8f-b486-d331995d68e8\" (UID: \"35489642-ba38-4b8f-b486-d331995d68e8\") " Mar 10 16:52:48 crc kubenswrapper[4749]: I0310 16:52:48.932484 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfnxc\" (UniqueName: \"kubernetes.io/projected/35489642-ba38-4b8f-b486-d331995d68e8-kube-api-access-vfnxc\") pod \"35489642-ba38-4b8f-b486-d331995d68e8\" (UID: \"35489642-ba38-4b8f-b486-d331995d68e8\") " 
Mar 10 16:52:48 crc kubenswrapper[4749]: I0310 16:52:48.933705 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35489642-ba38-4b8f-b486-d331995d68e8-utilities" (OuterVolumeSpecName: "utilities") pod "35489642-ba38-4b8f-b486-d331995d68e8" (UID: "35489642-ba38-4b8f-b486-d331995d68e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:52:48 crc kubenswrapper[4749]: I0310 16:52:48.937140 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35489642-ba38-4b8f-b486-d331995d68e8-kube-api-access-vfnxc" (OuterVolumeSpecName: "kube-api-access-vfnxc") pod "35489642-ba38-4b8f-b486-d331995d68e8" (UID: "35489642-ba38-4b8f-b486-d331995d68e8"). InnerVolumeSpecName "kube-api-access-vfnxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:52:48 crc kubenswrapper[4749]: I0310 16:52:48.988362 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35489642-ba38-4b8f-b486-d331995d68e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35489642-ba38-4b8f-b486-d331995d68e8" (UID: "35489642-ba38-4b8f-b486-d331995d68e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.034258 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35489642-ba38-4b8f-b486-d331995d68e8-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.034457 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35489642-ba38-4b8f-b486-d331995d68e8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.034551 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfnxc\" (UniqueName: \"kubernetes.io/projected/35489642-ba38-4b8f-b486-d331995d68e8-kube-api-access-vfnxc\") on node \"crc\" DevicePath \"\"" Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.487042 4749 generic.go:334] "Generic (PLEG): container finished" podID="35489642-ba38-4b8f-b486-d331995d68e8" containerID="abe34a7616d1f51b36a8c8269cc4f9a1f377b31973533498637db29b7f452cfc" exitCode=0 Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.487112 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snjmd" event={"ID":"35489642-ba38-4b8f-b486-d331995d68e8","Type":"ContainerDied","Data":"abe34a7616d1f51b36a8c8269cc4f9a1f377b31973533498637db29b7f452cfc"} Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.487143 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-snjmd" Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.487170 4749 scope.go:117] "RemoveContainer" containerID="abe34a7616d1f51b36a8c8269cc4f9a1f377b31973533498637db29b7f452cfc" Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.487153 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-snjmd" event={"ID":"35489642-ba38-4b8f-b486-d331995d68e8","Type":"ContainerDied","Data":"cf249396bf212aa752c5dfe82aed197492d63520db01adfb46bc43caa04cdb73"} Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.521108 4749 scope.go:117] "RemoveContainer" containerID="ddb85a70890ed34d7a69ac0f0b375a7d7f27f9a63ed0cbc84021f57704300f64" Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.522216 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-snjmd"] Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.538071 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-snjmd"] Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.542737 4749 scope.go:117] "RemoveContainer" containerID="8af079d7a8c1f25058cf6bc5cc42bc13c723f84bca9c6de5e4e22a75db444f03" Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.571819 4749 scope.go:117] "RemoveContainer" containerID="abe34a7616d1f51b36a8c8269cc4f9a1f377b31973533498637db29b7f452cfc" Mar 10 16:52:49 crc kubenswrapper[4749]: E0310 16:52:49.575080 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abe34a7616d1f51b36a8c8269cc4f9a1f377b31973533498637db29b7f452cfc\": container with ID starting with abe34a7616d1f51b36a8c8269cc4f9a1f377b31973533498637db29b7f452cfc not found: ID does not exist" containerID="abe34a7616d1f51b36a8c8269cc4f9a1f377b31973533498637db29b7f452cfc" Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.575114 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abe34a7616d1f51b36a8c8269cc4f9a1f377b31973533498637db29b7f452cfc"} err="failed to get container status \"abe34a7616d1f51b36a8c8269cc4f9a1f377b31973533498637db29b7f452cfc\": rpc error: code = NotFound desc = could not find container \"abe34a7616d1f51b36a8c8269cc4f9a1f377b31973533498637db29b7f452cfc\": container with ID starting with abe34a7616d1f51b36a8c8269cc4f9a1f377b31973533498637db29b7f452cfc not found: ID does not exist" Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.575135 4749 scope.go:117] "RemoveContainer" containerID="ddb85a70890ed34d7a69ac0f0b375a7d7f27f9a63ed0cbc84021f57704300f64" Mar 10 16:52:49 crc kubenswrapper[4749]: E0310 16:52:49.575546 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddb85a70890ed34d7a69ac0f0b375a7d7f27f9a63ed0cbc84021f57704300f64\": container with ID starting with ddb85a70890ed34d7a69ac0f0b375a7d7f27f9a63ed0cbc84021f57704300f64 not found: ID does not exist" containerID="ddb85a70890ed34d7a69ac0f0b375a7d7f27f9a63ed0cbc84021f57704300f64" Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.575572 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb85a70890ed34d7a69ac0f0b375a7d7f27f9a63ed0cbc84021f57704300f64"} err="failed to get container status \"ddb85a70890ed34d7a69ac0f0b375a7d7f27f9a63ed0cbc84021f57704300f64\": rpc error: code = NotFound desc = could not find container \"ddb85a70890ed34d7a69ac0f0b375a7d7f27f9a63ed0cbc84021f57704300f64\": container with ID starting with ddb85a70890ed34d7a69ac0f0b375a7d7f27f9a63ed0cbc84021f57704300f64 not found: ID does not exist" Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.575584 4749 scope.go:117] "RemoveContainer" containerID="8af079d7a8c1f25058cf6bc5cc42bc13c723f84bca9c6de5e4e22a75db444f03" Mar 10 16:52:49 crc kubenswrapper[4749]: E0310 
16:52:49.575815 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8af079d7a8c1f25058cf6bc5cc42bc13c723f84bca9c6de5e4e22a75db444f03\": container with ID starting with 8af079d7a8c1f25058cf6bc5cc42bc13c723f84bca9c6de5e4e22a75db444f03 not found: ID does not exist" containerID="8af079d7a8c1f25058cf6bc5cc42bc13c723f84bca9c6de5e4e22a75db444f03" Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.575838 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af079d7a8c1f25058cf6bc5cc42bc13c723f84bca9c6de5e4e22a75db444f03"} err="failed to get container status \"8af079d7a8c1f25058cf6bc5cc42bc13c723f84bca9c6de5e4e22a75db444f03\": rpc error: code = NotFound desc = could not find container \"8af079d7a8c1f25058cf6bc5cc42bc13c723f84bca9c6de5e4e22a75db444f03\": container with ID starting with 8af079d7a8c1f25058cf6bc5cc42bc13c723f84bca9c6de5e4e22a75db444f03 not found: ID does not exist" Mar 10 16:52:49 crc kubenswrapper[4749]: I0310 16:52:49.614274 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35489642-ba38-4b8f-b486-d331995d68e8" path="/var/lib/kubelet/pods/35489642-ba38-4b8f-b486-d331995d68e8/volumes" Mar 10 16:52:50 crc kubenswrapper[4749]: I0310 16:52:50.980970 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:52:50 crc kubenswrapper[4749]: I0310 16:52:50.981070 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 10 16:53:20 crc kubenswrapper[4749]: I0310 16:53:20.980410 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:53:20 crc kubenswrapper[4749]: I0310 16:53:20.980962 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:53:50 crc kubenswrapper[4749]: I0310 16:53:50.980371 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:53:50 crc kubenswrapper[4749]: I0310 16:53:50.981005 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:53:50 crc kubenswrapper[4749]: I0310 16:53:50.981069 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 16:53:50 crc kubenswrapper[4749]: I0310 16:53:50.981834 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b"} 
pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:53:50 crc kubenswrapper[4749]: I0310 16:53:50.981930 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" gracePeriod=600 Mar 10 16:53:51 crc kubenswrapper[4749]: E0310 16:53:51.650960 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:53:51 crc kubenswrapper[4749]: I0310 16:53:51.742767 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" exitCode=0 Mar 10 16:53:51 crc kubenswrapper[4749]: I0310 16:53:51.742831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b"} Mar 10 16:53:51 crc kubenswrapper[4749]: I0310 16:53:51.742897 4749 scope.go:117] "RemoveContainer" containerID="74293f44aec15e963c26cf00402fbd62f5c1d2c0bf3a5e91e07dee6478a803b1" Mar 10 16:53:51 crc kubenswrapper[4749]: I0310 16:53:51.743579 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 
10 16:53:51 crc kubenswrapper[4749]: E0310 16:53:51.743903 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:54:00 crc kubenswrapper[4749]: I0310 16:54:00.145276 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552694-px97l"] Mar 10 16:54:00 crc kubenswrapper[4749]: E0310 16:54:00.146119 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35489642-ba38-4b8f-b486-d331995d68e8" containerName="extract-utilities" Mar 10 16:54:00 crc kubenswrapper[4749]: I0310 16:54:00.146136 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="35489642-ba38-4b8f-b486-d331995d68e8" containerName="extract-utilities" Mar 10 16:54:00 crc kubenswrapper[4749]: E0310 16:54:00.146165 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35489642-ba38-4b8f-b486-d331995d68e8" containerName="extract-content" Mar 10 16:54:00 crc kubenswrapper[4749]: I0310 16:54:00.146174 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="35489642-ba38-4b8f-b486-d331995d68e8" containerName="extract-content" Mar 10 16:54:00 crc kubenswrapper[4749]: E0310 16:54:00.146188 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35489642-ba38-4b8f-b486-d331995d68e8" containerName="registry-server" Mar 10 16:54:00 crc kubenswrapper[4749]: I0310 16:54:00.146199 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="35489642-ba38-4b8f-b486-d331995d68e8" containerName="registry-server" Mar 10 16:54:00 crc kubenswrapper[4749]: I0310 16:54:00.146365 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="35489642-ba38-4b8f-b486-d331995d68e8" containerName="registry-server" Mar 10 16:54:00 crc kubenswrapper[4749]: I0310 16:54:00.146893 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552694-px97l" Mar 10 16:54:00 crc kubenswrapper[4749]: I0310 16:54:00.151463 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:54:00 crc kubenswrapper[4749]: I0310 16:54:00.151734 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:54:00 crc kubenswrapper[4749]: I0310 16:54:00.151867 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:54:00 crc kubenswrapper[4749]: I0310 16:54:00.157893 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552694-px97l"] Mar 10 16:54:00 crc kubenswrapper[4749]: I0310 16:54:00.276163 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7r9d\" (UniqueName: \"kubernetes.io/projected/a45251be-2727-494c-aacc-4695243ad22c-kube-api-access-v7r9d\") pod \"auto-csr-approver-29552694-px97l\" (UID: \"a45251be-2727-494c-aacc-4695243ad22c\") " pod="openshift-infra/auto-csr-approver-29552694-px97l" Mar 10 16:54:00 crc kubenswrapper[4749]: I0310 16:54:00.377605 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7r9d\" (UniqueName: \"kubernetes.io/projected/a45251be-2727-494c-aacc-4695243ad22c-kube-api-access-v7r9d\") pod \"auto-csr-approver-29552694-px97l\" (UID: \"a45251be-2727-494c-aacc-4695243ad22c\") " pod="openshift-infra/auto-csr-approver-29552694-px97l" Mar 10 16:54:00 crc kubenswrapper[4749]: I0310 16:54:00.570063 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7r9d\" 
(UniqueName: \"kubernetes.io/projected/a45251be-2727-494c-aacc-4695243ad22c-kube-api-access-v7r9d\") pod \"auto-csr-approver-29552694-px97l\" (UID: \"a45251be-2727-494c-aacc-4695243ad22c\") " pod="openshift-infra/auto-csr-approver-29552694-px97l" Mar 10 16:54:00 crc kubenswrapper[4749]: I0310 16:54:00.787443 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552694-px97l" Mar 10 16:54:01 crc kubenswrapper[4749]: I0310 16:54:01.206413 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552694-px97l"] Mar 10 16:54:01 crc kubenswrapper[4749]: I0310 16:54:01.209758 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:54:01 crc kubenswrapper[4749]: I0310 16:54:01.820843 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552694-px97l" event={"ID":"a45251be-2727-494c-aacc-4695243ad22c","Type":"ContainerStarted","Data":"659b5eb1c4ddde54590b45dba51e92428bc2bc851e081a70ee975d36ce81f92a"} Mar 10 16:54:03 crc kubenswrapper[4749]: I0310 16:54:03.837045 4749 generic.go:334] "Generic (PLEG): container finished" podID="a45251be-2727-494c-aacc-4695243ad22c" containerID="2acd43e99cd8916a09584e0eeba6ea31e943628202fecc65862ef0a9ae028027" exitCode=0 Mar 10 16:54:03 crc kubenswrapper[4749]: I0310 16:54:03.837145 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552694-px97l" event={"ID":"a45251be-2727-494c-aacc-4695243ad22c","Type":"ContainerDied","Data":"2acd43e99cd8916a09584e0eeba6ea31e943628202fecc65862ef0a9ae028027"} Mar 10 16:54:05 crc kubenswrapper[4749]: I0310 16:54:05.088375 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552694-px97l" Mar 10 16:54:05 crc kubenswrapper[4749]: I0310 16:54:05.251273 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7r9d\" (UniqueName: \"kubernetes.io/projected/a45251be-2727-494c-aacc-4695243ad22c-kube-api-access-v7r9d\") pod \"a45251be-2727-494c-aacc-4695243ad22c\" (UID: \"a45251be-2727-494c-aacc-4695243ad22c\") " Mar 10 16:54:05 crc kubenswrapper[4749]: I0310 16:54:05.263663 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45251be-2727-494c-aacc-4695243ad22c-kube-api-access-v7r9d" (OuterVolumeSpecName: "kube-api-access-v7r9d") pod "a45251be-2727-494c-aacc-4695243ad22c" (UID: "a45251be-2727-494c-aacc-4695243ad22c"). InnerVolumeSpecName "kube-api-access-v7r9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:54:05 crc kubenswrapper[4749]: I0310 16:54:05.352428 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7r9d\" (UniqueName: \"kubernetes.io/projected/a45251be-2727-494c-aacc-4695243ad22c-kube-api-access-v7r9d\") on node \"crc\" DevicePath \"\"" Mar 10 16:54:05 crc kubenswrapper[4749]: I0310 16:54:05.852506 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552694-px97l" event={"ID":"a45251be-2727-494c-aacc-4695243ad22c","Type":"ContainerDied","Data":"659b5eb1c4ddde54590b45dba51e92428bc2bc851e081a70ee975d36ce81f92a"} Mar 10 16:54:05 crc kubenswrapper[4749]: I0310 16:54:05.852838 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="659b5eb1c4ddde54590b45dba51e92428bc2bc851e081a70ee975d36ce81f92a" Mar 10 16:54:05 crc kubenswrapper[4749]: I0310 16:54:05.852578 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552694-px97l" Mar 10 16:54:06 crc kubenswrapper[4749]: I0310 16:54:06.155859 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552688-r4bj7"] Mar 10 16:54:06 crc kubenswrapper[4749]: I0310 16:54:06.162568 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552688-r4bj7"] Mar 10 16:54:06 crc kubenswrapper[4749]: I0310 16:54:06.607103 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:54:06 crc kubenswrapper[4749]: E0310 16:54:06.607316 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:54:07 crc kubenswrapper[4749]: I0310 16:54:07.615859 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db2d468-b4bb-4df7-b02c-26c2c0dad9f3" path="/var/lib/kubelet/pods/8db2d468-b4bb-4df7-b02c-26c2c0dad9f3/volumes" Mar 10 16:54:18 crc kubenswrapper[4749]: I0310 16:54:18.607201 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:54:18 crc kubenswrapper[4749]: E0310 16:54:18.608498 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:54:25 crc kubenswrapper[4749]: I0310 16:54:25.863473 4749 scope.go:117] "RemoveContainer" containerID="5c6d9540fb87ee5c46f7d5c91f937bfeb2c646574029c08739852705606a60b8" Mar 10 16:54:31 crc kubenswrapper[4749]: I0310 16:54:31.607086 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:54:31 crc kubenswrapper[4749]: E0310 16:54:31.607864 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:54:46 crc kubenswrapper[4749]: I0310 16:54:46.608045 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:54:46 crc kubenswrapper[4749]: E0310 16:54:46.609197 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:54:59 crc kubenswrapper[4749]: I0310 16:54:59.606727 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:54:59 crc kubenswrapper[4749]: E0310 16:54:59.607732 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:55:10 crc kubenswrapper[4749]: I0310 16:55:10.606454 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:55:10 crc kubenswrapper[4749]: E0310 16:55:10.607066 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:55:19 crc kubenswrapper[4749]: I0310 16:55:19.816241 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vb8jm"] Mar 10 16:55:19 crc kubenswrapper[4749]: E0310 16:55:19.817165 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45251be-2727-494c-aacc-4695243ad22c" containerName="oc" Mar 10 16:55:19 crc kubenswrapper[4749]: I0310 16:55:19.817181 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45251be-2727-494c-aacc-4695243ad22c" containerName="oc" Mar 10 16:55:19 crc kubenswrapper[4749]: I0310 16:55:19.817339 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45251be-2727-494c-aacc-4695243ad22c" containerName="oc" Mar 10 16:55:19 crc kubenswrapper[4749]: I0310 16:55:19.818926 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:19 crc kubenswrapper[4749]: I0310 16:55:19.831483 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vb8jm"] Mar 10 16:55:19 crc kubenswrapper[4749]: I0310 16:55:19.873534 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/744279f2-54a7-4fac-97eb-857784a119fb-utilities\") pod \"redhat-operators-vb8jm\" (UID: \"744279f2-54a7-4fac-97eb-857784a119fb\") " pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:19 crc kubenswrapper[4749]: I0310 16:55:19.873639 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/744279f2-54a7-4fac-97eb-857784a119fb-catalog-content\") pod \"redhat-operators-vb8jm\" (UID: \"744279f2-54a7-4fac-97eb-857784a119fb\") " pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:19 crc kubenswrapper[4749]: I0310 16:55:19.873678 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbrc2\" (UniqueName: \"kubernetes.io/projected/744279f2-54a7-4fac-97eb-857784a119fb-kube-api-access-bbrc2\") pod \"redhat-operators-vb8jm\" (UID: \"744279f2-54a7-4fac-97eb-857784a119fb\") " pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:19 crc kubenswrapper[4749]: I0310 16:55:19.974226 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/744279f2-54a7-4fac-97eb-857784a119fb-catalog-content\") pod \"redhat-operators-vb8jm\" (UID: \"744279f2-54a7-4fac-97eb-857784a119fb\") " pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:19 crc kubenswrapper[4749]: I0310 16:55:19.974773 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-bbrc2\" (UniqueName: \"kubernetes.io/projected/744279f2-54a7-4fac-97eb-857784a119fb-kube-api-access-bbrc2\") pod \"redhat-operators-vb8jm\" (UID: \"744279f2-54a7-4fac-97eb-857784a119fb\") " pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:19 crc kubenswrapper[4749]: I0310 16:55:19.974713 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/744279f2-54a7-4fac-97eb-857784a119fb-catalog-content\") pod \"redhat-operators-vb8jm\" (UID: \"744279f2-54a7-4fac-97eb-857784a119fb\") " pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:19 crc kubenswrapper[4749]: I0310 16:55:19.975253 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/744279f2-54a7-4fac-97eb-857784a119fb-utilities\") pod \"redhat-operators-vb8jm\" (UID: \"744279f2-54a7-4fac-97eb-857784a119fb\") " pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:19 crc kubenswrapper[4749]: I0310 16:55:19.975714 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/744279f2-54a7-4fac-97eb-857784a119fb-utilities\") pod \"redhat-operators-vb8jm\" (UID: \"744279f2-54a7-4fac-97eb-857784a119fb\") " pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:20 crc kubenswrapper[4749]: I0310 16:55:20.000986 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbrc2\" (UniqueName: \"kubernetes.io/projected/744279f2-54a7-4fac-97eb-857784a119fb-kube-api-access-bbrc2\") pod \"redhat-operators-vb8jm\" (UID: \"744279f2-54a7-4fac-97eb-857784a119fb\") " pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:20 crc kubenswrapper[4749]: I0310 16:55:20.145100 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:20 crc kubenswrapper[4749]: I0310 16:55:20.616121 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vb8jm"] Mar 10 16:55:20 crc kubenswrapper[4749]: I0310 16:55:20.710466 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vb8jm" event={"ID":"744279f2-54a7-4fac-97eb-857784a119fb","Type":"ContainerStarted","Data":"0d7f1df26d4045c5f985fe8798745d75bb8300d23de036f4d6977a527e2e55fb"} Mar 10 16:55:21 crc kubenswrapper[4749]: I0310 16:55:21.607055 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:55:21 crc kubenswrapper[4749]: E0310 16:55:21.607715 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:55:21 crc kubenswrapper[4749]: I0310 16:55:21.718690 4749 generic.go:334] "Generic (PLEG): container finished" podID="744279f2-54a7-4fac-97eb-857784a119fb" containerID="fbc737f55920820c5e59e51819e1e3b40c69c46715149253245eec25e5717860" exitCode=0 Mar 10 16:55:21 crc kubenswrapper[4749]: I0310 16:55:21.718731 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vb8jm" event={"ID":"744279f2-54a7-4fac-97eb-857784a119fb","Type":"ContainerDied","Data":"fbc737f55920820c5e59e51819e1e3b40c69c46715149253245eec25e5717860"} Mar 10 16:55:29 crc kubenswrapper[4749]: I0310 16:55:29.787144 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vb8jm" 
event={"ID":"744279f2-54a7-4fac-97eb-857784a119fb","Type":"ContainerStarted","Data":"782a2b656509f3a0f3a8ce909a2d5f6593abc027b73d70111b955f308ede751a"} Mar 10 16:55:30 crc kubenswrapper[4749]: I0310 16:55:30.795615 4749 generic.go:334] "Generic (PLEG): container finished" podID="744279f2-54a7-4fac-97eb-857784a119fb" containerID="782a2b656509f3a0f3a8ce909a2d5f6593abc027b73d70111b955f308ede751a" exitCode=0 Mar 10 16:55:30 crc kubenswrapper[4749]: I0310 16:55:30.795737 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vb8jm" event={"ID":"744279f2-54a7-4fac-97eb-857784a119fb","Type":"ContainerDied","Data":"782a2b656509f3a0f3a8ce909a2d5f6593abc027b73d70111b955f308ede751a"} Mar 10 16:55:31 crc kubenswrapper[4749]: I0310 16:55:31.804760 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vb8jm" event={"ID":"744279f2-54a7-4fac-97eb-857784a119fb","Type":"ContainerStarted","Data":"03932bc19e5414bd6bfb39cf5c14a5d7b6ab3eac9e73841235a7da1482e208ee"} Mar 10 16:55:31 crc kubenswrapper[4749]: I0310 16:55:31.821940 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vb8jm" podStartSLOduration=3.342683067 podStartE2EDuration="12.821922527s" podCreationTimestamp="2026-03-10 16:55:19 +0000 UTC" firstStartedPulling="2026-03-10 16:55:21.720367606 +0000 UTC m=+4018.842233293" lastFinishedPulling="2026-03-10 16:55:31.199607076 +0000 UTC m=+4028.321472753" observedRunningTime="2026-03-10 16:55:31.818605787 +0000 UTC m=+4028.940471474" watchObservedRunningTime="2026-03-10 16:55:31.821922527 +0000 UTC m=+4028.943788214" Mar 10 16:55:34 crc kubenswrapper[4749]: I0310 16:55:34.606757 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:55:34 crc kubenswrapper[4749]: E0310 16:55:34.607427 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:55:40 crc kubenswrapper[4749]: I0310 16:55:40.146266 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:40 crc kubenswrapper[4749]: I0310 16:55:40.146869 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:40 crc kubenswrapper[4749]: I0310 16:55:40.185469 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:40 crc kubenswrapper[4749]: I0310 16:55:40.909044 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vb8jm" Mar 10 16:55:40 crc kubenswrapper[4749]: I0310 16:55:40.973827 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vb8jm"] Mar 10 16:55:41 crc kubenswrapper[4749]: I0310 16:55:41.021397 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q5cvw"] Mar 10 16:55:41 crc kubenswrapper[4749]: I0310 16:55:41.021683 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q5cvw" podUID="904b24b3-e0d3-452a-855c-3cfc2f78a152" containerName="registry-server" containerID="cri-o://7111f9a04cb360503aa14adb4b460e3c8c75c67b6a681a6e43bfa174016cc567" gracePeriod=2 Mar 10 16:55:43 crc kubenswrapper[4749]: I0310 16:55:43.883013 4749 generic.go:334] "Generic (PLEG): container finished" podID="904b24b3-e0d3-452a-855c-3cfc2f78a152" 
containerID="7111f9a04cb360503aa14adb4b460e3c8c75c67b6a681a6e43bfa174016cc567" exitCode=0 Mar 10 16:55:43 crc kubenswrapper[4749]: I0310 16:55:43.883091 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5cvw" event={"ID":"904b24b3-e0d3-452a-855c-3cfc2f78a152","Type":"ContainerDied","Data":"7111f9a04cb360503aa14adb4b460e3c8c75c67b6a681a6e43bfa174016cc567"} Mar 10 16:55:44 crc kubenswrapper[4749]: I0310 16:55:44.970234 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q5cvw" Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.150965 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r9zz\" (UniqueName: \"kubernetes.io/projected/904b24b3-e0d3-452a-855c-3cfc2f78a152-kube-api-access-8r9zz\") pod \"904b24b3-e0d3-452a-855c-3cfc2f78a152\" (UID: \"904b24b3-e0d3-452a-855c-3cfc2f78a152\") " Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.151156 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/904b24b3-e0d3-452a-855c-3cfc2f78a152-catalog-content\") pod \"904b24b3-e0d3-452a-855c-3cfc2f78a152\" (UID: \"904b24b3-e0d3-452a-855c-3cfc2f78a152\") " Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.151225 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/904b24b3-e0d3-452a-855c-3cfc2f78a152-utilities\") pod \"904b24b3-e0d3-452a-855c-3cfc2f78a152\" (UID: \"904b24b3-e0d3-452a-855c-3cfc2f78a152\") " Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.152175 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/904b24b3-e0d3-452a-855c-3cfc2f78a152-utilities" (OuterVolumeSpecName: "utilities") pod "904b24b3-e0d3-452a-855c-3cfc2f78a152" (UID: 
"904b24b3-e0d3-452a-855c-3cfc2f78a152"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.156099 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/904b24b3-e0d3-452a-855c-3cfc2f78a152-kube-api-access-8r9zz" (OuterVolumeSpecName: "kube-api-access-8r9zz") pod "904b24b3-e0d3-452a-855c-3cfc2f78a152" (UID: "904b24b3-e0d3-452a-855c-3cfc2f78a152"). InnerVolumeSpecName "kube-api-access-8r9zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.252921 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/904b24b3-e0d3-452a-855c-3cfc2f78a152-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.252967 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r9zz\" (UniqueName: \"kubernetes.io/projected/904b24b3-e0d3-452a-855c-3cfc2f78a152-kube-api-access-8r9zz\") on node \"crc\" DevicePath \"\"" Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.291858 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/904b24b3-e0d3-452a-855c-3cfc2f78a152-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "904b24b3-e0d3-452a-855c-3cfc2f78a152" (UID: "904b24b3-e0d3-452a-855c-3cfc2f78a152"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.353389 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/904b24b3-e0d3-452a-855c-3cfc2f78a152-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.899599 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5cvw" event={"ID":"904b24b3-e0d3-452a-855c-3cfc2f78a152","Type":"ContainerDied","Data":"aada8b15bffba48008e8e58b6cb48731a3c83bb599c9a22c425f1363e58b2845"} Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.899905 4749 scope.go:117] "RemoveContainer" containerID="7111f9a04cb360503aa14adb4b460e3c8c75c67b6a681a6e43bfa174016cc567" Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.899670 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q5cvw" Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.924891 4749 scope.go:117] "RemoveContainer" containerID="f988c6e1708cd842cf39a44fb18a84dfa26f998b9228002fd4d83112ae2803d3" Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.929752 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q5cvw"] Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.938983 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q5cvw"] Mar 10 16:55:45 crc kubenswrapper[4749]: I0310 16:55:45.950526 4749 scope.go:117] "RemoveContainer" containerID="eead918b0c52354a9bdd0f6e1aea1e10631e5d6751e4c31d3e7783b00b26126e" Mar 10 16:55:47 crc kubenswrapper[4749]: I0310 16:55:47.607348 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:55:47 crc kubenswrapper[4749]: E0310 16:55:47.607853 4749 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:55:47 crc kubenswrapper[4749]: I0310 16:55:47.619922 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="904b24b3-e0d3-452a-855c-3cfc2f78a152" path="/var/lib/kubelet/pods/904b24b3-e0d3-452a-855c-3cfc2f78a152/volumes" Mar 10 16:55:58 crc kubenswrapper[4749]: I0310 16:55:58.608672 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:55:58 crc kubenswrapper[4749]: E0310 16:55:58.609802 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.143018 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552696-d876x"] Mar 10 16:56:00 crc kubenswrapper[4749]: E0310 16:56:00.143355 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="904b24b3-e0d3-452a-855c-3cfc2f78a152" containerName="extract-content" Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.143372 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="904b24b3-e0d3-452a-855c-3cfc2f78a152" containerName="extract-content" Mar 10 16:56:00 crc kubenswrapper[4749]: E0310 16:56:00.143408 4749 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="904b24b3-e0d3-452a-855c-3cfc2f78a152" containerName="extract-utilities" Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.143415 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="904b24b3-e0d3-452a-855c-3cfc2f78a152" containerName="extract-utilities" Mar 10 16:56:00 crc kubenswrapper[4749]: E0310 16:56:00.143427 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="904b24b3-e0d3-452a-855c-3cfc2f78a152" containerName="registry-server" Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.143436 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="904b24b3-e0d3-452a-855c-3cfc2f78a152" containerName="registry-server" Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.143583 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="904b24b3-e0d3-452a-855c-3cfc2f78a152" containerName="registry-server" Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.144086 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552696-d876x" Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.145924 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.147838 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.148000 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.159312 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552696-d876x"] Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.289355 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz9gg\" (UniqueName: 
\"kubernetes.io/projected/cafef8ea-7c3f-485b-b796-6e93a47117a7-kube-api-access-vz9gg\") pod \"auto-csr-approver-29552696-d876x\" (UID: \"cafef8ea-7c3f-485b-b796-6e93a47117a7\") " pod="openshift-infra/auto-csr-approver-29552696-d876x" Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.391173 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz9gg\" (UniqueName: \"kubernetes.io/projected/cafef8ea-7c3f-485b-b796-6e93a47117a7-kube-api-access-vz9gg\") pod \"auto-csr-approver-29552696-d876x\" (UID: \"cafef8ea-7c3f-485b-b796-6e93a47117a7\") " pod="openshift-infra/auto-csr-approver-29552696-d876x" Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.428856 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz9gg\" (UniqueName: \"kubernetes.io/projected/cafef8ea-7c3f-485b-b796-6e93a47117a7-kube-api-access-vz9gg\") pod \"auto-csr-approver-29552696-d876x\" (UID: \"cafef8ea-7c3f-485b-b796-6e93a47117a7\") " pod="openshift-infra/auto-csr-approver-29552696-d876x" Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.469151 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552696-d876x" Mar 10 16:56:00 crc kubenswrapper[4749]: I0310 16:56:00.928565 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552696-d876x"] Mar 10 16:56:01 crc kubenswrapper[4749]: I0310 16:56:01.034199 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552696-d876x" event={"ID":"cafef8ea-7c3f-485b-b796-6e93a47117a7","Type":"ContainerStarted","Data":"b9dc04fe08cd9461bb3c80eb369465880f528ed31bd7cfeec24c5163d32c01c1"} Mar 10 16:56:03 crc kubenswrapper[4749]: I0310 16:56:03.052306 4749 generic.go:334] "Generic (PLEG): container finished" podID="cafef8ea-7c3f-485b-b796-6e93a47117a7" containerID="46a20d137b3df3bbe3c383387cf5439471e3b4f73947010687a855eb3036eccb" exitCode=0 Mar 10 16:56:03 crc kubenswrapper[4749]: I0310 16:56:03.052388 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552696-d876x" event={"ID":"cafef8ea-7c3f-485b-b796-6e93a47117a7","Type":"ContainerDied","Data":"46a20d137b3df3bbe3c383387cf5439471e3b4f73947010687a855eb3036eccb"} Mar 10 16:56:04 crc kubenswrapper[4749]: I0310 16:56:04.491876 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552696-d876x" Mar 10 16:56:04 crc kubenswrapper[4749]: I0310 16:56:04.546876 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz9gg\" (UniqueName: \"kubernetes.io/projected/cafef8ea-7c3f-485b-b796-6e93a47117a7-kube-api-access-vz9gg\") pod \"cafef8ea-7c3f-485b-b796-6e93a47117a7\" (UID: \"cafef8ea-7c3f-485b-b796-6e93a47117a7\") " Mar 10 16:56:04 crc kubenswrapper[4749]: I0310 16:56:04.554058 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cafef8ea-7c3f-485b-b796-6e93a47117a7-kube-api-access-vz9gg" (OuterVolumeSpecName: "kube-api-access-vz9gg") pod "cafef8ea-7c3f-485b-b796-6e93a47117a7" (UID: "cafef8ea-7c3f-485b-b796-6e93a47117a7"). InnerVolumeSpecName "kube-api-access-vz9gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:56:04 crc kubenswrapper[4749]: I0310 16:56:04.647901 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz9gg\" (UniqueName: \"kubernetes.io/projected/cafef8ea-7c3f-485b-b796-6e93a47117a7-kube-api-access-vz9gg\") on node \"crc\" DevicePath \"\"" Mar 10 16:56:05 crc kubenswrapper[4749]: I0310 16:56:05.067121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552696-d876x" event={"ID":"cafef8ea-7c3f-485b-b796-6e93a47117a7","Type":"ContainerDied","Data":"b9dc04fe08cd9461bb3c80eb369465880f528ed31bd7cfeec24c5163d32c01c1"} Mar 10 16:56:05 crc kubenswrapper[4749]: I0310 16:56:05.067417 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9dc04fe08cd9461bb3c80eb369465880f528ed31bd7cfeec24c5163d32c01c1" Mar 10 16:56:05 crc kubenswrapper[4749]: I0310 16:56:05.067519 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552696-d876x" Mar 10 16:56:05 crc kubenswrapper[4749]: I0310 16:56:05.566066 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552690-v6sjh"] Mar 10 16:56:05 crc kubenswrapper[4749]: I0310 16:56:05.574593 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552690-v6sjh"] Mar 10 16:56:05 crc kubenswrapper[4749]: I0310 16:56:05.618426 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="700bba28-9400-4971-a767-0b7012b59a1d" path="/var/lib/kubelet/pods/700bba28-9400-4971-a767-0b7012b59a1d/volumes" Mar 10 16:56:09 crc kubenswrapper[4749]: I0310 16:56:09.607634 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:56:09 crc kubenswrapper[4749]: E0310 16:56:09.608099 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:56:20 crc kubenswrapper[4749]: I0310 16:56:20.607241 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:56:20 crc kubenswrapper[4749]: E0310 16:56:20.608060 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:56:25 crc kubenswrapper[4749]: I0310 16:56:25.963873 4749 scope.go:117] "RemoveContainer" containerID="077a582715f24a86e89cbbf5abbb33a598156559fb021acfd8a37775b3223464" Mar 10 16:56:33 crc kubenswrapper[4749]: I0310 16:56:33.616050 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:56:33 crc kubenswrapper[4749]: E0310 16:56:33.617305 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:56:47 crc kubenswrapper[4749]: I0310 16:56:47.607074 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:56:47 crc kubenswrapper[4749]: E0310 16:56:47.607905 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:57:02 crc kubenswrapper[4749]: I0310 16:57:02.606910 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:57:02 crc kubenswrapper[4749]: E0310 16:57:02.607762 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:57:15 crc kubenswrapper[4749]: I0310 16:57:15.607436 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:57:15 crc kubenswrapper[4749]: E0310 16:57:15.608347 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.617646 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7bkd4"] Mar 10 16:57:21 crc kubenswrapper[4749]: E0310 16:57:21.619741 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cafef8ea-7c3f-485b-b796-6e93a47117a7" containerName="oc" Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.619860 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cafef8ea-7c3f-485b-b796-6e93a47117a7" containerName="oc" Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.620167 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cafef8ea-7c3f-485b-b796-6e93a47117a7" containerName="oc" Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.621489 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.631746 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bkd4"] Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.759080 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcwq5\" (UniqueName: \"kubernetes.io/projected/30d4f359-b6d2-4542-8a3d-3d9f44990059-kube-api-access-rcwq5\") pod \"redhat-marketplace-7bkd4\" (UID: \"30d4f359-b6d2-4542-8a3d-3d9f44990059\") " pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.759184 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d4f359-b6d2-4542-8a3d-3d9f44990059-catalog-content\") pod \"redhat-marketplace-7bkd4\" (UID: \"30d4f359-b6d2-4542-8a3d-3d9f44990059\") " pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.759217 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d4f359-b6d2-4542-8a3d-3d9f44990059-utilities\") pod \"redhat-marketplace-7bkd4\" (UID: \"30d4f359-b6d2-4542-8a3d-3d9f44990059\") " pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.861191 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcwq5\" (UniqueName: \"kubernetes.io/projected/30d4f359-b6d2-4542-8a3d-3d9f44990059-kube-api-access-rcwq5\") pod \"redhat-marketplace-7bkd4\" (UID: \"30d4f359-b6d2-4542-8a3d-3d9f44990059\") " pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.861250 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d4f359-b6d2-4542-8a3d-3d9f44990059-catalog-content\") pod \"redhat-marketplace-7bkd4\" (UID: \"30d4f359-b6d2-4542-8a3d-3d9f44990059\") " pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.861267 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d4f359-b6d2-4542-8a3d-3d9f44990059-utilities\") pod \"redhat-marketplace-7bkd4\" (UID: \"30d4f359-b6d2-4542-8a3d-3d9f44990059\") " pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.861947 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d4f359-b6d2-4542-8a3d-3d9f44990059-catalog-content\") pod \"redhat-marketplace-7bkd4\" (UID: \"30d4f359-b6d2-4542-8a3d-3d9f44990059\") " pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.862081 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d4f359-b6d2-4542-8a3d-3d9f44990059-utilities\") pod \"redhat-marketplace-7bkd4\" (UID: \"30d4f359-b6d2-4542-8a3d-3d9f44990059\") " pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.886226 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcwq5\" (UniqueName: \"kubernetes.io/projected/30d4f359-b6d2-4542-8a3d-3d9f44990059-kube-api-access-rcwq5\") pod \"redhat-marketplace-7bkd4\" (UID: \"30d4f359-b6d2-4542-8a3d-3d9f44990059\") " pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:21 crc kubenswrapper[4749]: I0310 16:57:21.945859 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:22 crc kubenswrapper[4749]: I0310 16:57:22.426195 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bkd4"] Mar 10 16:57:22 crc kubenswrapper[4749]: I0310 16:57:22.730803 4749 generic.go:334] "Generic (PLEG): container finished" podID="30d4f359-b6d2-4542-8a3d-3d9f44990059" containerID="54d07bd6dc1acd5f96f27d2f5980d62140f4009ad4b59a6179606e32b2821c1d" exitCode=0 Mar 10 16:57:22 crc kubenswrapper[4749]: I0310 16:57:22.730862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bkd4" event={"ID":"30d4f359-b6d2-4542-8a3d-3d9f44990059","Type":"ContainerDied","Data":"54d07bd6dc1acd5f96f27d2f5980d62140f4009ad4b59a6179606e32b2821c1d"} Mar 10 16:57:22 crc kubenswrapper[4749]: I0310 16:57:22.730898 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bkd4" event={"ID":"30d4f359-b6d2-4542-8a3d-3d9f44990059","Type":"ContainerStarted","Data":"660f3222f1619502f904aa530941af1c392bd0d3d20732b54bae43ae02279144"} Mar 10 16:57:24 crc kubenswrapper[4749]: I0310 16:57:24.747423 4749 generic.go:334] "Generic (PLEG): container finished" podID="30d4f359-b6d2-4542-8a3d-3d9f44990059" containerID="12542f8bef1ef82f672b3aa95362bd2db87ae58481243d7c6d30934a45d58402" exitCode=0 Mar 10 16:57:24 crc kubenswrapper[4749]: I0310 16:57:24.747519 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bkd4" event={"ID":"30d4f359-b6d2-4542-8a3d-3d9f44990059","Type":"ContainerDied","Data":"12542f8bef1ef82f672b3aa95362bd2db87ae58481243d7c6d30934a45d58402"} Mar 10 16:57:25 crc kubenswrapper[4749]: I0310 16:57:25.756182 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bkd4" 
event={"ID":"30d4f359-b6d2-4542-8a3d-3d9f44990059","Type":"ContainerStarted","Data":"3ee7a1b0ea2dbecea9f3adfd3844f803aa3ea0010c4ce68ec7b226514a887c3d"} Mar 10 16:57:25 crc kubenswrapper[4749]: I0310 16:57:25.779001 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7bkd4" podStartSLOduration=2.117476334 podStartE2EDuration="4.778979787s" podCreationTimestamp="2026-03-10 16:57:21 +0000 UTC" firstStartedPulling="2026-03-10 16:57:22.732891695 +0000 UTC m=+4139.854757382" lastFinishedPulling="2026-03-10 16:57:25.394395148 +0000 UTC m=+4142.516260835" observedRunningTime="2026-03-10 16:57:25.774435494 +0000 UTC m=+4142.896301201" watchObservedRunningTime="2026-03-10 16:57:25.778979787 +0000 UTC m=+4142.900845474" Mar 10 16:57:28 crc kubenswrapper[4749]: I0310 16:57:28.606880 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:57:28 crc kubenswrapper[4749]: E0310 16:57:28.607446 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:57:31 crc kubenswrapper[4749]: I0310 16:57:31.946482 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:31 crc kubenswrapper[4749]: I0310 16:57:31.946902 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:32 crc kubenswrapper[4749]: I0310 16:57:32.010174 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:32 crc kubenswrapper[4749]: I0310 16:57:32.846128 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:32 crc kubenswrapper[4749]: I0310 16:57:32.898456 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bkd4"] Mar 10 16:57:34 crc kubenswrapper[4749]: I0310 16:57:34.814792 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7bkd4" podUID="30d4f359-b6d2-4542-8a3d-3d9f44990059" containerName="registry-server" containerID="cri-o://3ee7a1b0ea2dbecea9f3adfd3844f803aa3ea0010c4ce68ec7b226514a887c3d" gracePeriod=2 Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.782237 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.824436 4749 generic.go:334] "Generic (PLEG): container finished" podID="30d4f359-b6d2-4542-8a3d-3d9f44990059" containerID="3ee7a1b0ea2dbecea9f3adfd3844f803aa3ea0010c4ce68ec7b226514a887c3d" exitCode=0 Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.824476 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bkd4" event={"ID":"30d4f359-b6d2-4542-8a3d-3d9f44990059","Type":"ContainerDied","Data":"3ee7a1b0ea2dbecea9f3adfd3844f803aa3ea0010c4ce68ec7b226514a887c3d"} Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.824500 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bkd4" event={"ID":"30d4f359-b6d2-4542-8a3d-3d9f44990059","Type":"ContainerDied","Data":"660f3222f1619502f904aa530941af1c392bd0d3d20732b54bae43ae02279144"} Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.824502 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7bkd4" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.824518 4749 scope.go:117] "RemoveContainer" containerID="3ee7a1b0ea2dbecea9f3adfd3844f803aa3ea0010c4ce68ec7b226514a887c3d" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.851561 4749 scope.go:117] "RemoveContainer" containerID="12542f8bef1ef82f672b3aa95362bd2db87ae58481243d7c6d30934a45d58402" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.864978 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d4f359-b6d2-4542-8a3d-3d9f44990059-utilities\") pod \"30d4f359-b6d2-4542-8a3d-3d9f44990059\" (UID: \"30d4f359-b6d2-4542-8a3d-3d9f44990059\") " Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.865105 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcwq5\" (UniqueName: \"kubernetes.io/projected/30d4f359-b6d2-4542-8a3d-3d9f44990059-kube-api-access-rcwq5\") pod \"30d4f359-b6d2-4542-8a3d-3d9f44990059\" (UID: \"30d4f359-b6d2-4542-8a3d-3d9f44990059\") " Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.865145 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d4f359-b6d2-4542-8a3d-3d9f44990059-catalog-content\") pod \"30d4f359-b6d2-4542-8a3d-3d9f44990059\" (UID: \"30d4f359-b6d2-4542-8a3d-3d9f44990059\") " Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.866895 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30d4f359-b6d2-4542-8a3d-3d9f44990059-utilities" (OuterVolumeSpecName: "utilities") pod "30d4f359-b6d2-4542-8a3d-3d9f44990059" (UID: "30d4f359-b6d2-4542-8a3d-3d9f44990059"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.866990 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d4f359-b6d2-4542-8a3d-3d9f44990059-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.870178 4749 scope.go:117] "RemoveContainer" containerID="54d07bd6dc1acd5f96f27d2f5980d62140f4009ad4b59a6179606e32b2821c1d" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.871683 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d4f359-b6d2-4542-8a3d-3d9f44990059-kube-api-access-rcwq5" (OuterVolumeSpecName: "kube-api-access-rcwq5") pod "30d4f359-b6d2-4542-8a3d-3d9f44990059" (UID: "30d4f359-b6d2-4542-8a3d-3d9f44990059"). InnerVolumeSpecName "kube-api-access-rcwq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.902103 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30d4f359-b6d2-4542-8a3d-3d9f44990059-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30d4f359-b6d2-4542-8a3d-3d9f44990059" (UID: "30d4f359-b6d2-4542-8a3d-3d9f44990059"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.915870 4749 scope.go:117] "RemoveContainer" containerID="3ee7a1b0ea2dbecea9f3adfd3844f803aa3ea0010c4ce68ec7b226514a887c3d" Mar 10 16:57:35 crc kubenswrapper[4749]: E0310 16:57:35.916845 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee7a1b0ea2dbecea9f3adfd3844f803aa3ea0010c4ce68ec7b226514a887c3d\": container with ID starting with 3ee7a1b0ea2dbecea9f3adfd3844f803aa3ea0010c4ce68ec7b226514a887c3d not found: ID does not exist" containerID="3ee7a1b0ea2dbecea9f3adfd3844f803aa3ea0010c4ce68ec7b226514a887c3d" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.916951 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee7a1b0ea2dbecea9f3adfd3844f803aa3ea0010c4ce68ec7b226514a887c3d"} err="failed to get container status \"3ee7a1b0ea2dbecea9f3adfd3844f803aa3ea0010c4ce68ec7b226514a887c3d\": rpc error: code = NotFound desc = could not find container \"3ee7a1b0ea2dbecea9f3adfd3844f803aa3ea0010c4ce68ec7b226514a887c3d\": container with ID starting with 3ee7a1b0ea2dbecea9f3adfd3844f803aa3ea0010c4ce68ec7b226514a887c3d not found: ID does not exist" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.917065 4749 scope.go:117] "RemoveContainer" containerID="12542f8bef1ef82f672b3aa95362bd2db87ae58481243d7c6d30934a45d58402" Mar 10 16:57:35 crc kubenswrapper[4749]: E0310 16:57:35.917543 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12542f8bef1ef82f672b3aa95362bd2db87ae58481243d7c6d30934a45d58402\": container with ID starting with 12542f8bef1ef82f672b3aa95362bd2db87ae58481243d7c6d30934a45d58402 not found: ID does not exist" containerID="12542f8bef1ef82f672b3aa95362bd2db87ae58481243d7c6d30934a45d58402" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.917574 
4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12542f8bef1ef82f672b3aa95362bd2db87ae58481243d7c6d30934a45d58402"} err="failed to get container status \"12542f8bef1ef82f672b3aa95362bd2db87ae58481243d7c6d30934a45d58402\": rpc error: code = NotFound desc = could not find container \"12542f8bef1ef82f672b3aa95362bd2db87ae58481243d7c6d30934a45d58402\": container with ID starting with 12542f8bef1ef82f672b3aa95362bd2db87ae58481243d7c6d30934a45d58402 not found: ID does not exist" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.917596 4749 scope.go:117] "RemoveContainer" containerID="54d07bd6dc1acd5f96f27d2f5980d62140f4009ad4b59a6179606e32b2821c1d" Mar 10 16:57:35 crc kubenswrapper[4749]: E0310 16:57:35.917988 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54d07bd6dc1acd5f96f27d2f5980d62140f4009ad4b59a6179606e32b2821c1d\": container with ID starting with 54d07bd6dc1acd5f96f27d2f5980d62140f4009ad4b59a6179606e32b2821c1d not found: ID does not exist" containerID="54d07bd6dc1acd5f96f27d2f5980d62140f4009ad4b59a6179606e32b2821c1d" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.918014 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d07bd6dc1acd5f96f27d2f5980d62140f4009ad4b59a6179606e32b2821c1d"} err="failed to get container status \"54d07bd6dc1acd5f96f27d2f5980d62140f4009ad4b59a6179606e32b2821c1d\": rpc error: code = NotFound desc = could not find container \"54d07bd6dc1acd5f96f27d2f5980d62140f4009ad4b59a6179606e32b2821c1d\": container with ID starting with 54d07bd6dc1acd5f96f27d2f5980d62140f4009ad4b59a6179606e32b2821c1d not found: ID does not exist" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.968534 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcwq5\" (UniqueName: 
\"kubernetes.io/projected/30d4f359-b6d2-4542-8a3d-3d9f44990059-kube-api-access-rcwq5\") on node \"crc\" DevicePath \"\"" Mar 10 16:57:35 crc kubenswrapper[4749]: I0310 16:57:35.968587 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d4f359-b6d2-4542-8a3d-3d9f44990059-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:57:36 crc kubenswrapper[4749]: I0310 16:57:36.161101 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bkd4"] Mar 10 16:57:36 crc kubenswrapper[4749]: I0310 16:57:36.165972 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bkd4"] Mar 10 16:57:37 crc kubenswrapper[4749]: I0310 16:57:37.617181 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d4f359-b6d2-4542-8a3d-3d9f44990059" path="/var/lib/kubelet/pods/30d4f359-b6d2-4542-8a3d-3d9f44990059/volumes" Mar 10 16:57:42 crc kubenswrapper[4749]: I0310 16:57:42.606900 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:57:42 crc kubenswrapper[4749]: E0310 16:57:42.607799 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:57:55 crc kubenswrapper[4749]: I0310 16:57:55.607296 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:57:55 crc kubenswrapper[4749]: E0310 16:57:55.608324 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.146416 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552698-l9z46"] Mar 10 16:58:00 crc kubenswrapper[4749]: E0310 16:58:00.147366 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d4f359-b6d2-4542-8a3d-3d9f44990059" containerName="extract-content" Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.147396 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d4f359-b6d2-4542-8a3d-3d9f44990059" containerName="extract-content" Mar 10 16:58:00 crc kubenswrapper[4749]: E0310 16:58:00.147414 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d4f359-b6d2-4542-8a3d-3d9f44990059" containerName="registry-server" Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.147420 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d4f359-b6d2-4542-8a3d-3d9f44990059" containerName="registry-server" Mar 10 16:58:00 crc kubenswrapper[4749]: E0310 16:58:00.147434 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d4f359-b6d2-4542-8a3d-3d9f44990059" containerName="extract-utilities" Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.147440 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d4f359-b6d2-4542-8a3d-3d9f44990059" containerName="extract-utilities" Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.147598 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="30d4f359-b6d2-4542-8a3d-3d9f44990059" containerName="registry-server" Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.148016 4749 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552698-l9z46" Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.150986 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.151276 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.152688 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.154608 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfjwm\" (UniqueName: \"kubernetes.io/projected/7b089428-11c1-44f6-bd22-dd3a2da969e2-kube-api-access-bfjwm\") pod \"auto-csr-approver-29552698-l9z46\" (UID: \"7b089428-11c1-44f6-bd22-dd3a2da969e2\") " pod="openshift-infra/auto-csr-approver-29552698-l9z46" Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.163085 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552698-l9z46"] Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.255847 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfjwm\" (UniqueName: \"kubernetes.io/projected/7b089428-11c1-44f6-bd22-dd3a2da969e2-kube-api-access-bfjwm\") pod \"auto-csr-approver-29552698-l9z46\" (UID: \"7b089428-11c1-44f6-bd22-dd3a2da969e2\") " pod="openshift-infra/auto-csr-approver-29552698-l9z46" Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.367491 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfjwm\" (UniqueName: \"kubernetes.io/projected/7b089428-11c1-44f6-bd22-dd3a2da969e2-kube-api-access-bfjwm\") pod \"auto-csr-approver-29552698-l9z46\" (UID: 
\"7b089428-11c1-44f6-bd22-dd3a2da969e2\") " pod="openshift-infra/auto-csr-approver-29552698-l9z46" Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.470835 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552698-l9z46" Mar 10 16:58:00 crc kubenswrapper[4749]: I0310 16:58:00.918340 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552698-l9z46"] Mar 10 16:58:01 crc kubenswrapper[4749]: I0310 16:58:01.008153 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552698-l9z46" event={"ID":"7b089428-11c1-44f6-bd22-dd3a2da969e2","Type":"ContainerStarted","Data":"e3f948bf63863dab65652d3b4f1b54fe33fd9dc17b77b2d923fff5d7da8e5891"} Mar 10 16:58:03 crc kubenswrapper[4749]: I0310 16:58:03.028113 4749 generic.go:334] "Generic (PLEG): container finished" podID="7b089428-11c1-44f6-bd22-dd3a2da969e2" containerID="e57eb39468f4dcc29df16e35709e82753763f3b2d3147c6d885789d5440727b7" exitCode=0 Mar 10 16:58:03 crc kubenswrapper[4749]: I0310 16:58:03.028424 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552698-l9z46" event={"ID":"7b089428-11c1-44f6-bd22-dd3a2da969e2","Type":"ContainerDied","Data":"e57eb39468f4dcc29df16e35709e82753763f3b2d3147c6d885789d5440727b7"} Mar 10 16:58:04 crc kubenswrapper[4749]: I0310 16:58:04.314091 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552698-l9z46" Mar 10 16:58:04 crc kubenswrapper[4749]: I0310 16:58:04.416296 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfjwm\" (UniqueName: \"kubernetes.io/projected/7b089428-11c1-44f6-bd22-dd3a2da969e2-kube-api-access-bfjwm\") pod \"7b089428-11c1-44f6-bd22-dd3a2da969e2\" (UID: \"7b089428-11c1-44f6-bd22-dd3a2da969e2\") " Mar 10 16:58:04 crc kubenswrapper[4749]: I0310 16:58:04.423816 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b089428-11c1-44f6-bd22-dd3a2da969e2-kube-api-access-bfjwm" (OuterVolumeSpecName: "kube-api-access-bfjwm") pod "7b089428-11c1-44f6-bd22-dd3a2da969e2" (UID: "7b089428-11c1-44f6-bd22-dd3a2da969e2"). InnerVolumeSpecName "kube-api-access-bfjwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:58:04 crc kubenswrapper[4749]: I0310 16:58:04.517503 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfjwm\" (UniqueName: \"kubernetes.io/projected/7b089428-11c1-44f6-bd22-dd3a2da969e2-kube-api-access-bfjwm\") on node \"crc\" DevicePath \"\"" Mar 10 16:58:05 crc kubenswrapper[4749]: I0310 16:58:05.049127 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552698-l9z46" event={"ID":"7b089428-11c1-44f6-bd22-dd3a2da969e2","Type":"ContainerDied","Data":"e3f948bf63863dab65652d3b4f1b54fe33fd9dc17b77b2d923fff5d7da8e5891"} Mar 10 16:58:05 crc kubenswrapper[4749]: I0310 16:58:05.049196 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3f948bf63863dab65652d3b4f1b54fe33fd9dc17b77b2d923fff5d7da8e5891" Mar 10 16:58:05 crc kubenswrapper[4749]: I0310 16:58:05.049309 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552698-l9z46" Mar 10 16:58:05 crc kubenswrapper[4749]: I0310 16:58:05.390777 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552692-7pnnm"] Mar 10 16:58:05 crc kubenswrapper[4749]: I0310 16:58:05.397817 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552692-7pnnm"] Mar 10 16:58:05 crc kubenswrapper[4749]: I0310 16:58:05.616481 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c82c5f08-6ea1-4f78-af60-7d14d3ecaf16" path="/var/lib/kubelet/pods/c82c5f08-6ea1-4f78-af60-7d14d3ecaf16/volumes" Mar 10 16:58:09 crc kubenswrapper[4749]: I0310 16:58:09.606909 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:58:09 crc kubenswrapper[4749]: E0310 16:58:09.607597 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:58:20 crc kubenswrapper[4749]: I0310 16:58:20.607575 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:58:20 crc kubenswrapper[4749]: E0310 16:58:20.609237 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:58:26 crc kubenswrapper[4749]: I0310 16:58:26.065596 4749 scope.go:117] "RemoveContainer" containerID="7af55ea33703b408b500778b57ba189ff9753b30f2fcfb2012ea922449b648cc" Mar 10 16:58:35 crc kubenswrapper[4749]: I0310 16:58:35.606625 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:58:35 crc kubenswrapper[4749]: E0310 16:58:35.607542 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:58:47 crc kubenswrapper[4749]: I0310 16:58:47.607122 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:58:47 crc kubenswrapper[4749]: E0310 16:58:47.608009 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 16:58:59 crc kubenswrapper[4749]: I0310 16:58:59.606835 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 16:59:00 crc kubenswrapper[4749]: I0310 16:59:00.520183 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"50143155b07c28aad6ceff4991afedb9e3d0e4ace9db1b7305fb156222ab014a"} Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.153874 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj"] Mar 10 17:00:00 crc kubenswrapper[4749]: E0310 17:00:00.154865 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b089428-11c1-44f6-bd22-dd3a2da969e2" containerName="oc" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.154887 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b089428-11c1-44f6-bd22-dd3a2da969e2" containerName="oc" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.155049 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b089428-11c1-44f6-bd22-dd3a2da969e2" containerName="oc" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.155647 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.162919 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.166106 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.177023 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552700-mspvw"] Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.179134 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552700-mspvw" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.184185 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.184665 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.184802 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.196192 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj"] Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.210992 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552700-mspvw"] Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.296568 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4rvm\" (UniqueName: \"kubernetes.io/projected/a4a50fbf-1816-4d0b-b32a-0ea4842556cc-kube-api-access-c4rvm\") pod \"auto-csr-approver-29552700-mspvw\" (UID: \"a4a50fbf-1816-4d0b-b32a-0ea4842556cc\") " pod="openshift-infra/auto-csr-approver-29552700-mspvw" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.296627 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhw9v\" (UniqueName: \"kubernetes.io/projected/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-kube-api-access-hhw9v\") pod \"collect-profiles-29552700-75mwj\" (UID: \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.296679 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-secret-volume\") pod \"collect-profiles-29552700-75mwj\" (UID: \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.296742 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-config-volume\") pod \"collect-profiles-29552700-75mwj\" (UID: \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.397784 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-config-volume\") pod \"collect-profiles-29552700-75mwj\" (UID: \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.397850 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4rvm\" (UniqueName: \"kubernetes.io/projected/a4a50fbf-1816-4d0b-b32a-0ea4842556cc-kube-api-access-c4rvm\") pod \"auto-csr-approver-29552700-mspvw\" (UID: \"a4a50fbf-1816-4d0b-b32a-0ea4842556cc\") " pod="openshift-infra/auto-csr-approver-29552700-mspvw" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.397881 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhw9v\" (UniqueName: \"kubernetes.io/projected/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-kube-api-access-hhw9v\") pod \"collect-profiles-29552700-75mwj\" (UID: \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.397948 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-secret-volume\") pod \"collect-profiles-29552700-75mwj\" (UID: \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.398778 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-config-volume\") pod \"collect-profiles-29552700-75mwj\" (UID: \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.407645 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-secret-volume\") pod \"collect-profiles-29552700-75mwj\" (UID: \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.421688 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhw9v\" (UniqueName: \"kubernetes.io/projected/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-kube-api-access-hhw9v\") pod \"collect-profiles-29552700-75mwj\" (UID: \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.426081 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4rvm\" (UniqueName: 
\"kubernetes.io/projected/a4a50fbf-1816-4d0b-b32a-0ea4842556cc-kube-api-access-c4rvm\") pod \"auto-csr-approver-29552700-mspvw\" (UID: \"a4a50fbf-1816-4d0b-b32a-0ea4842556cc\") " pod="openshift-infra/auto-csr-approver-29552700-mspvw" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.503816 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552700-mspvw" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.504842 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.959865 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552700-mspvw"] Mar 10 17:00:00 crc kubenswrapper[4749]: I0310 17:00:00.968736 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 17:00:01 crc kubenswrapper[4749]: I0310 17:00:01.011174 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj"] Mar 10 17:00:01 crc kubenswrapper[4749]: W0310 17:00:01.014954 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b5557f9_e0d9_4d97_a1d5_7c1b79e4f5b6.slice/crio-2ea92a9a5a1d19ca244a7b185c4c53b53089254373252ff05e9c27b3d14d0f5b WatchSource:0}: Error finding container 2ea92a9a5a1d19ca244a7b185c4c53b53089254373252ff05e9c27b3d14d0f5b: Status 404 returned error can't find the container with id 2ea92a9a5a1d19ca244a7b185c4c53b53089254373252ff05e9c27b3d14d0f5b Mar 10 17:00:01 crc kubenswrapper[4749]: I0310 17:00:01.157206 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" 
event={"ID":"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6","Type":"ContainerStarted","Data":"2ea92a9a5a1d19ca244a7b185c4c53b53089254373252ff05e9c27b3d14d0f5b"} Mar 10 17:00:01 crc kubenswrapper[4749]: I0310 17:00:01.160101 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552700-mspvw" event={"ID":"a4a50fbf-1816-4d0b-b32a-0ea4842556cc","Type":"ContainerStarted","Data":"4eb589f927137920358a63ca158584a297198a3db6fe86e160f4999bbd84eb26"} Mar 10 17:00:02 crc kubenswrapper[4749]: I0310 17:00:02.166984 4749 generic.go:334] "Generic (PLEG): container finished" podID="7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6" containerID="22f7717a7a4f35adae422a2a61e35ab6c2e4705449439cad9997f4a0627d245d" exitCode=0 Mar 10 17:00:02 crc kubenswrapper[4749]: I0310 17:00:02.167030 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" event={"ID":"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6","Type":"ContainerDied","Data":"22f7717a7a4f35adae422a2a61e35ab6c2e4705449439cad9997f4a0627d245d"} Mar 10 17:00:03 crc kubenswrapper[4749]: I0310 17:00:03.457990 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" Mar 10 17:00:03 crc kubenswrapper[4749]: I0310 17:00:03.544940 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-config-volume\") pod \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\" (UID: \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\") " Mar 10 17:00:03 crc kubenswrapper[4749]: I0310 17:00:03.545014 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhw9v\" (UniqueName: \"kubernetes.io/projected/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-kube-api-access-hhw9v\") pod \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\" (UID: \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\") " Mar 10 17:00:03 crc kubenswrapper[4749]: I0310 17:00:03.545036 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-secret-volume\") pod \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\" (UID: \"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6\") " Mar 10 17:00:03 crc kubenswrapper[4749]: I0310 17:00:03.545812 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-config-volume" (OuterVolumeSpecName: "config-volume") pod "7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6" (UID: "7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:00:03 crc kubenswrapper[4749]: I0310 17:00:03.558654 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6" (UID: "7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:00:03 crc kubenswrapper[4749]: I0310 17:00:03.564582 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-kube-api-access-hhw9v" (OuterVolumeSpecName: "kube-api-access-hhw9v") pod "7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6" (UID: "7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6"). InnerVolumeSpecName "kube-api-access-hhw9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:00:03 crc kubenswrapper[4749]: I0310 17:00:03.646414 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 17:00:03 crc kubenswrapper[4749]: I0310 17:00:03.646454 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhw9v\" (UniqueName: \"kubernetes.io/projected/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-kube-api-access-hhw9v\") on node \"crc\" DevicePath \"\"" Mar 10 17:00:03 crc kubenswrapper[4749]: I0310 17:00:03.646465 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 17:00:04 crc kubenswrapper[4749]: I0310 17:00:04.198124 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" event={"ID":"7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6","Type":"ContainerDied","Data":"2ea92a9a5a1d19ca244a7b185c4c53b53089254373252ff05e9c27b3d14d0f5b"} Mar 10 17:00:04 crc kubenswrapper[4749]: I0310 17:00:04.198168 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ea92a9a5a1d19ca244a7b185c4c53b53089254373252ff05e9c27b3d14d0f5b" Mar 10 17:00:04 crc kubenswrapper[4749]: I0310 17:00:04.198171 4749 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj" Mar 10 17:00:04 crc kubenswrapper[4749]: I0310 17:00:04.544543 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x"] Mar 10 17:00:04 crc kubenswrapper[4749]: I0310 17:00:04.552438 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552655-8q95x"] Mar 10 17:00:05 crc kubenswrapper[4749]: I0310 17:00:05.615188 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3a0be48-6c5a-4831-8371-6f60a1250eaa" path="/var/lib/kubelet/pods/b3a0be48-6c5a-4831-8371-6f60a1250eaa/volumes" Mar 10 17:00:06 crc kubenswrapper[4749]: I0310 17:00:06.216640 4749 generic.go:334] "Generic (PLEG): container finished" podID="a4a50fbf-1816-4d0b-b32a-0ea4842556cc" containerID="739d1249f9bcd35fe5bc32c73ad3e12c272f652ec2a2b467a1a9419c8abfa035" exitCode=0 Mar 10 17:00:06 crc kubenswrapper[4749]: I0310 17:00:06.216709 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552700-mspvw" event={"ID":"a4a50fbf-1816-4d0b-b32a-0ea4842556cc","Type":"ContainerDied","Data":"739d1249f9bcd35fe5bc32c73ad3e12c272f652ec2a2b467a1a9419c8abfa035"} Mar 10 17:00:07 crc kubenswrapper[4749]: I0310 17:00:07.497002 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552700-mspvw" Mar 10 17:00:07 crc kubenswrapper[4749]: I0310 17:00:07.502532 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4rvm\" (UniqueName: \"kubernetes.io/projected/a4a50fbf-1816-4d0b-b32a-0ea4842556cc-kube-api-access-c4rvm\") pod \"a4a50fbf-1816-4d0b-b32a-0ea4842556cc\" (UID: \"a4a50fbf-1816-4d0b-b32a-0ea4842556cc\") " Mar 10 17:00:07 crc kubenswrapper[4749]: I0310 17:00:07.508824 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a50fbf-1816-4d0b-b32a-0ea4842556cc-kube-api-access-c4rvm" (OuterVolumeSpecName: "kube-api-access-c4rvm") pod "a4a50fbf-1816-4d0b-b32a-0ea4842556cc" (UID: "a4a50fbf-1816-4d0b-b32a-0ea4842556cc"). InnerVolumeSpecName "kube-api-access-c4rvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:00:07 crc kubenswrapper[4749]: I0310 17:00:07.603701 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4rvm\" (UniqueName: \"kubernetes.io/projected/a4a50fbf-1816-4d0b-b32a-0ea4842556cc-kube-api-access-c4rvm\") on node \"crc\" DevicePath \"\"" Mar 10 17:00:08 crc kubenswrapper[4749]: I0310 17:00:08.252630 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552700-mspvw" event={"ID":"a4a50fbf-1816-4d0b-b32a-0ea4842556cc","Type":"ContainerDied","Data":"4eb589f927137920358a63ca158584a297198a3db6fe86e160f4999bbd84eb26"} Mar 10 17:00:08 crc kubenswrapper[4749]: I0310 17:00:08.252674 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4eb589f927137920358a63ca158584a297198a3db6fe86e160f4999bbd84eb26" Mar 10 17:00:08 crc kubenswrapper[4749]: I0310 17:00:08.252680 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552700-mspvw" Mar 10 17:00:08 crc kubenswrapper[4749]: I0310 17:00:08.559199 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552694-px97l"] Mar 10 17:00:08 crc kubenswrapper[4749]: I0310 17:00:08.565966 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552694-px97l"] Mar 10 17:00:09 crc kubenswrapper[4749]: I0310 17:00:09.624808 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a45251be-2727-494c-aacc-4695243ad22c" path="/var/lib/kubelet/pods/a45251be-2727-494c-aacc-4695243ad22c/volumes" Mar 10 17:00:26 crc kubenswrapper[4749]: I0310 17:00:26.187794 4749 scope.go:117] "RemoveContainer" containerID="2acd43e99cd8916a09584e0eeba6ea31e943628202fecc65862ef0a9ae028027" Mar 10 17:00:26 crc kubenswrapper[4749]: I0310 17:00:26.242789 4749 scope.go:117] "RemoveContainer" containerID="6a54d61e01f2e24062754f111bb8a89a06d95070350664872098f58aad28fe0b" Mar 10 17:01:20 crc kubenswrapper[4749]: I0310 17:01:20.980811 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:01:20 crc kubenswrapper[4749]: I0310 17:01:20.981617 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.211349 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7dc5f"] Mar 10 17:01:40 crc kubenswrapper[4749]: E0310 
17:01:40.212023 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a50fbf-1816-4d0b-b32a-0ea4842556cc" containerName="oc" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.212035 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a50fbf-1816-4d0b-b32a-0ea4842556cc" containerName="oc" Mar 10 17:01:40 crc kubenswrapper[4749]: E0310 17:01:40.212060 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6" containerName="collect-profiles" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.212066 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6" containerName="collect-profiles" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.212201 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a50fbf-1816-4d0b-b32a-0ea4842556cc" containerName="oc" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.212215 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6" containerName="collect-profiles" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.213105 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.230493 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7dc5f"] Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.379275 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh5lt\" (UniqueName: \"kubernetes.io/projected/d7da43f3-e0a9-4954-ba2c-1ab44a570246-kube-api-access-bh5lt\") pod \"community-operators-7dc5f\" (UID: \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\") " pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.379871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7da43f3-e0a9-4954-ba2c-1ab44a570246-utilities\") pod \"community-operators-7dc5f\" (UID: \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\") " pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.379928 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7da43f3-e0a9-4954-ba2c-1ab44a570246-catalog-content\") pod \"community-operators-7dc5f\" (UID: \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\") " pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.481766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7da43f3-e0a9-4954-ba2c-1ab44a570246-utilities\") pod \"community-operators-7dc5f\" (UID: \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\") " pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.481832 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7da43f3-e0a9-4954-ba2c-1ab44a570246-catalog-content\") pod \"community-operators-7dc5f\" (UID: \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\") " pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.481896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh5lt\" (UniqueName: \"kubernetes.io/projected/d7da43f3-e0a9-4954-ba2c-1ab44a570246-kube-api-access-bh5lt\") pod \"community-operators-7dc5f\" (UID: \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\") " pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.482465 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7da43f3-e0a9-4954-ba2c-1ab44a570246-utilities\") pod \"community-operators-7dc5f\" (UID: \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\") " pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.482483 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7da43f3-e0a9-4954-ba2c-1ab44a570246-catalog-content\") pod \"community-operators-7dc5f\" (UID: \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\") " pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.506677 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh5lt\" (UniqueName: \"kubernetes.io/projected/d7da43f3-e0a9-4954-ba2c-1ab44a570246-kube-api-access-bh5lt\") pod \"community-operators-7dc5f\" (UID: \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\") " pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:40 crc kubenswrapper[4749]: I0310 17:01:40.532989 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:41 crc kubenswrapper[4749]: I0310 17:01:41.055795 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7dc5f"] Mar 10 17:01:41 crc kubenswrapper[4749]: I0310 17:01:41.929244 4749 generic.go:334] "Generic (PLEG): container finished" podID="d7da43f3-e0a9-4954-ba2c-1ab44a570246" containerID="5424ef0272056a64b0a4256b639e908dbf406133840fae826946b68228aaa209" exitCode=0 Mar 10 17:01:41 crc kubenswrapper[4749]: I0310 17:01:41.929302 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dc5f" event={"ID":"d7da43f3-e0a9-4954-ba2c-1ab44a570246","Type":"ContainerDied","Data":"5424ef0272056a64b0a4256b639e908dbf406133840fae826946b68228aaa209"} Mar 10 17:01:41 crc kubenswrapper[4749]: I0310 17:01:41.929611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dc5f" event={"ID":"d7da43f3-e0a9-4954-ba2c-1ab44a570246","Type":"ContainerStarted","Data":"14f1929b54c9ffb73f97d4bdef6da57af35b75bfae0a6b7be95588b64d197107"} Mar 10 17:01:43 crc kubenswrapper[4749]: I0310 17:01:43.953097 4749 generic.go:334] "Generic (PLEG): container finished" podID="d7da43f3-e0a9-4954-ba2c-1ab44a570246" containerID="3880e026c4c08c4e492e53227d17ab4c240cbefd23aacd4db6f470d848b8990a" exitCode=0 Mar 10 17:01:43 crc kubenswrapper[4749]: I0310 17:01:43.953211 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dc5f" event={"ID":"d7da43f3-e0a9-4954-ba2c-1ab44a570246","Type":"ContainerDied","Data":"3880e026c4c08c4e492e53227d17ab4c240cbefd23aacd4db6f470d848b8990a"} Mar 10 17:01:44 crc kubenswrapper[4749]: I0310 17:01:44.963416 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dc5f" 
event={"ID":"d7da43f3-e0a9-4954-ba2c-1ab44a570246","Type":"ContainerStarted","Data":"a5845b5f1421bbd809a32bd9b1db87d3c5b6607104f9fb2de76768b338fc8fad"} Mar 10 17:01:44 crc kubenswrapper[4749]: I0310 17:01:44.986397 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7dc5f" podStartSLOduration=2.427964598 podStartE2EDuration="4.986355284s" podCreationTimestamp="2026-03-10 17:01:40 +0000 UTC" firstStartedPulling="2026-03-10 17:01:41.931821502 +0000 UTC m=+4399.053687189" lastFinishedPulling="2026-03-10 17:01:44.490212148 +0000 UTC m=+4401.612077875" observedRunningTime="2026-03-10 17:01:44.983875937 +0000 UTC m=+4402.105741644" watchObservedRunningTime="2026-03-10 17:01:44.986355284 +0000 UTC m=+4402.108220971" Mar 10 17:01:50 crc kubenswrapper[4749]: I0310 17:01:50.533755 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:50 crc kubenswrapper[4749]: I0310 17:01:50.534151 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:50 crc kubenswrapper[4749]: I0310 17:01:50.571257 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:50 crc kubenswrapper[4749]: I0310 17:01:50.980946 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:01:50 crc kubenswrapper[4749]: I0310 17:01:50.981010 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:01:51 crc kubenswrapper[4749]: I0310 17:01:51.042524 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:53 crc kubenswrapper[4749]: I0310 17:01:53.711155 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7dc5f"] Mar 10 17:01:53 crc kubenswrapper[4749]: I0310 17:01:53.711721 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7dc5f" podUID="d7da43f3-e0a9-4954-ba2c-1ab44a570246" containerName="registry-server" containerID="cri-o://a5845b5f1421bbd809a32bd9b1db87d3c5b6607104f9fb2de76768b338fc8fad" gracePeriod=2 Mar 10 17:01:54 crc kubenswrapper[4749]: I0310 17:01:54.032418 4749 generic.go:334] "Generic (PLEG): container finished" podID="d7da43f3-e0a9-4954-ba2c-1ab44a570246" containerID="a5845b5f1421bbd809a32bd9b1db87d3c5b6607104f9fb2de76768b338fc8fad" exitCode=0 Mar 10 17:01:54 crc kubenswrapper[4749]: I0310 17:01:54.032494 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dc5f" event={"ID":"d7da43f3-e0a9-4954-ba2c-1ab44a570246","Type":"ContainerDied","Data":"a5845b5f1421bbd809a32bd9b1db87d3c5b6607104f9fb2de76768b338fc8fad"} Mar 10 17:01:54 crc kubenswrapper[4749]: I0310 17:01:54.183191 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:54 crc kubenswrapper[4749]: I0310 17:01:54.352772 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh5lt\" (UniqueName: \"kubernetes.io/projected/d7da43f3-e0a9-4954-ba2c-1ab44a570246-kube-api-access-bh5lt\") pod \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\" (UID: \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\") " Mar 10 17:01:54 crc kubenswrapper[4749]: I0310 17:01:54.352960 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7da43f3-e0a9-4954-ba2c-1ab44a570246-catalog-content\") pod \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\" (UID: \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\") " Mar 10 17:01:54 crc kubenswrapper[4749]: I0310 17:01:54.353042 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7da43f3-e0a9-4954-ba2c-1ab44a570246-utilities\") pod \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\" (UID: \"d7da43f3-e0a9-4954-ba2c-1ab44a570246\") " Mar 10 17:01:54 crc kubenswrapper[4749]: I0310 17:01:54.354081 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7da43f3-e0a9-4954-ba2c-1ab44a570246-utilities" (OuterVolumeSpecName: "utilities") pod "d7da43f3-e0a9-4954-ba2c-1ab44a570246" (UID: "d7da43f3-e0a9-4954-ba2c-1ab44a570246"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:01:54 crc kubenswrapper[4749]: I0310 17:01:54.359134 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7da43f3-e0a9-4954-ba2c-1ab44a570246-kube-api-access-bh5lt" (OuterVolumeSpecName: "kube-api-access-bh5lt") pod "d7da43f3-e0a9-4954-ba2c-1ab44a570246" (UID: "d7da43f3-e0a9-4954-ba2c-1ab44a570246"). InnerVolumeSpecName "kube-api-access-bh5lt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:01:54 crc kubenswrapper[4749]: I0310 17:01:54.456556 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh5lt\" (UniqueName: \"kubernetes.io/projected/d7da43f3-e0a9-4954-ba2c-1ab44a570246-kube-api-access-bh5lt\") on node \"crc\" DevicePath \"\"" Mar 10 17:01:54 crc kubenswrapper[4749]: I0310 17:01:54.456634 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7da43f3-e0a9-4954-ba2c-1ab44a570246-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 17:01:54 crc kubenswrapper[4749]: I0310 17:01:54.527463 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7da43f3-e0a9-4954-ba2c-1ab44a570246-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7da43f3-e0a9-4954-ba2c-1ab44a570246" (UID: "d7da43f3-e0a9-4954-ba2c-1ab44a570246"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:01:54 crc kubenswrapper[4749]: I0310 17:01:54.558851 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7da43f3-e0a9-4954-ba2c-1ab44a570246-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 17:01:55 crc kubenswrapper[4749]: I0310 17:01:55.042765 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7dc5f" event={"ID":"d7da43f3-e0a9-4954-ba2c-1ab44a570246","Type":"ContainerDied","Data":"14f1929b54c9ffb73f97d4bdef6da57af35b75bfae0a6b7be95588b64d197107"} Mar 10 17:01:55 crc kubenswrapper[4749]: I0310 17:01:55.042833 4749 scope.go:117] "RemoveContainer" containerID="a5845b5f1421bbd809a32bd9b1db87d3c5b6607104f9fb2de76768b338fc8fad" Mar 10 17:01:55 crc kubenswrapper[4749]: I0310 17:01:55.042831 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7dc5f" Mar 10 17:01:55 crc kubenswrapper[4749]: I0310 17:01:55.068315 4749 scope.go:117] "RemoveContainer" containerID="3880e026c4c08c4e492e53227d17ab4c240cbefd23aacd4db6f470d848b8990a" Mar 10 17:01:55 crc kubenswrapper[4749]: I0310 17:01:55.096206 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7dc5f"] Mar 10 17:01:55 crc kubenswrapper[4749]: I0310 17:01:55.103442 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7dc5f"] Mar 10 17:01:55 crc kubenswrapper[4749]: I0310 17:01:55.130556 4749 scope.go:117] "RemoveContainer" containerID="5424ef0272056a64b0a4256b639e908dbf406133840fae826946b68228aaa209" Mar 10 17:01:55 crc kubenswrapper[4749]: I0310 17:01:55.617200 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7da43f3-e0a9-4954-ba2c-1ab44a570246" path="/var/lib/kubelet/pods/d7da43f3-e0a9-4954-ba2c-1ab44a570246/volumes" Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 17:02:00.146125 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552702-rnb2h"] Mar 10 17:02:00 crc kubenswrapper[4749]: E0310 17:02:00.146727 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7da43f3-e0a9-4954-ba2c-1ab44a570246" containerName="registry-server" Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 17:02:00.146744 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7da43f3-e0a9-4954-ba2c-1ab44a570246" containerName="registry-server" Mar 10 17:02:00 crc kubenswrapper[4749]: E0310 17:02:00.146760 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7da43f3-e0a9-4954-ba2c-1ab44a570246" containerName="extract-content" Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 17:02:00.146766 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7da43f3-e0a9-4954-ba2c-1ab44a570246" containerName="extract-content" 
Mar 10 17:02:00 crc kubenswrapper[4749]: E0310 17:02:00.146782 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7da43f3-e0a9-4954-ba2c-1ab44a570246" containerName="extract-utilities" Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 17:02:00.146788 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7da43f3-e0a9-4954-ba2c-1ab44a570246" containerName="extract-utilities" Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 17:02:00.146950 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7da43f3-e0a9-4954-ba2c-1ab44a570246" containerName="registry-server" Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 17:02:00.147394 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552702-rnb2h" Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 17:02:00.150279 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 17:02:00.150329 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 17:02:00.150474 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 17:02:00.165943 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552702-rnb2h"] Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 17:02:00.243257 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzcj9\" (UniqueName: \"kubernetes.io/projected/f2eca018-61af-438e-aae9-c97ad72dfaa7-kube-api-access-dzcj9\") pod \"auto-csr-approver-29552702-rnb2h\" (UID: \"f2eca018-61af-438e-aae9-c97ad72dfaa7\") " pod="openshift-infra/auto-csr-approver-29552702-rnb2h" Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 
17:02:00.344048 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzcj9\" (UniqueName: \"kubernetes.io/projected/f2eca018-61af-438e-aae9-c97ad72dfaa7-kube-api-access-dzcj9\") pod \"auto-csr-approver-29552702-rnb2h\" (UID: \"f2eca018-61af-438e-aae9-c97ad72dfaa7\") " pod="openshift-infra/auto-csr-approver-29552702-rnb2h" Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 17:02:00.371909 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzcj9\" (UniqueName: \"kubernetes.io/projected/f2eca018-61af-438e-aae9-c97ad72dfaa7-kube-api-access-dzcj9\") pod \"auto-csr-approver-29552702-rnb2h\" (UID: \"f2eca018-61af-438e-aae9-c97ad72dfaa7\") " pod="openshift-infra/auto-csr-approver-29552702-rnb2h" Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 17:02:00.481446 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552702-rnb2h" Mar 10 17:02:00 crc kubenswrapper[4749]: I0310 17:02:00.929627 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552702-rnb2h"] Mar 10 17:02:01 crc kubenswrapper[4749]: I0310 17:02:01.132355 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552702-rnb2h" event={"ID":"f2eca018-61af-438e-aae9-c97ad72dfaa7","Type":"ContainerStarted","Data":"d8e6ea4678b1af5a0ff57dde8f01cd196e03ef9c89d6a5ce7d55cca9bd1677d0"} Mar 10 17:02:03 crc kubenswrapper[4749]: I0310 17:02:03.151755 4749 generic.go:334] "Generic (PLEG): container finished" podID="f2eca018-61af-438e-aae9-c97ad72dfaa7" containerID="4d6757f1bee9f6371b1943f452d32501524f8ed65310c7d683a3e7d1b8b8df50" exitCode=0 Mar 10 17:02:03 crc kubenswrapper[4749]: I0310 17:02:03.151900 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552702-rnb2h" 
event={"ID":"f2eca018-61af-438e-aae9-c97ad72dfaa7","Type":"ContainerDied","Data":"4d6757f1bee9f6371b1943f452d32501524f8ed65310c7d683a3e7d1b8b8df50"} Mar 10 17:02:04 crc kubenswrapper[4749]: I0310 17:02:04.483539 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552702-rnb2h" Mar 10 17:02:04 crc kubenswrapper[4749]: I0310 17:02:04.609253 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzcj9\" (UniqueName: \"kubernetes.io/projected/f2eca018-61af-438e-aae9-c97ad72dfaa7-kube-api-access-dzcj9\") pod \"f2eca018-61af-438e-aae9-c97ad72dfaa7\" (UID: \"f2eca018-61af-438e-aae9-c97ad72dfaa7\") " Mar 10 17:02:04 crc kubenswrapper[4749]: I0310 17:02:04.614311 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2eca018-61af-438e-aae9-c97ad72dfaa7-kube-api-access-dzcj9" (OuterVolumeSpecName: "kube-api-access-dzcj9") pod "f2eca018-61af-438e-aae9-c97ad72dfaa7" (UID: "f2eca018-61af-438e-aae9-c97ad72dfaa7"). InnerVolumeSpecName "kube-api-access-dzcj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:02:04 crc kubenswrapper[4749]: I0310 17:02:04.712070 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzcj9\" (UniqueName: \"kubernetes.io/projected/f2eca018-61af-438e-aae9-c97ad72dfaa7-kube-api-access-dzcj9\") on node \"crc\" DevicePath \"\"" Mar 10 17:02:05 crc kubenswrapper[4749]: I0310 17:02:05.171363 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552702-rnb2h" event={"ID":"f2eca018-61af-438e-aae9-c97ad72dfaa7","Type":"ContainerDied","Data":"d8e6ea4678b1af5a0ff57dde8f01cd196e03ef9c89d6a5ce7d55cca9bd1677d0"} Mar 10 17:02:05 crc kubenswrapper[4749]: I0310 17:02:05.171712 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8e6ea4678b1af5a0ff57dde8f01cd196e03ef9c89d6a5ce7d55cca9bd1677d0" Mar 10 17:02:05 crc kubenswrapper[4749]: I0310 17:02:05.171463 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552702-rnb2h" Mar 10 17:02:05 crc kubenswrapper[4749]: I0310 17:02:05.562621 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552696-d876x"] Mar 10 17:02:05 crc kubenswrapper[4749]: I0310 17:02:05.568789 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552696-d876x"] Mar 10 17:02:05 crc kubenswrapper[4749]: I0310 17:02:05.617780 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cafef8ea-7c3f-485b-b796-6e93a47117a7" path="/var/lib/kubelet/pods/cafef8ea-7c3f-485b-b796-6e93a47117a7/volumes" Mar 10 17:02:20 crc kubenswrapper[4749]: I0310 17:02:20.981065 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 17:02:20 crc kubenswrapper[4749]: I0310 17:02:20.981688 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:02:20 crc kubenswrapper[4749]: I0310 17:02:20.981735 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 17:02:20 crc kubenswrapper[4749]: I0310 17:02:20.982316 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"50143155b07c28aad6ceff4991afedb9e3d0e4ace9db1b7305fb156222ab014a"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 17:02:20 crc kubenswrapper[4749]: I0310 17:02:20.982408 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://50143155b07c28aad6ceff4991afedb9e3d0e4ace9db1b7305fb156222ab014a" gracePeriod=600 Mar 10 17:02:21 crc kubenswrapper[4749]: I0310 17:02:21.308736 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="50143155b07c28aad6ceff4991afedb9e3d0e4ace9db1b7305fb156222ab014a" exitCode=0 Mar 10 17:02:21 crc kubenswrapper[4749]: I0310 17:02:21.308826 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"50143155b07c28aad6ceff4991afedb9e3d0e4ace9db1b7305fb156222ab014a"} Mar 10 17:02:21 crc kubenswrapper[4749]: I0310 17:02:21.309268 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"} Mar 10 17:02:21 crc kubenswrapper[4749]: I0310 17:02:21.309301 4749 scope.go:117] "RemoveContainer" containerID="70db06265fe0a0b47a564d0a6e136814005f637e427ce3da5d655f6dfac3875b" Mar 10 17:02:26 crc kubenswrapper[4749]: I0310 17:02:26.332844 4749 scope.go:117] "RemoveContainer" containerID="46a20d137b3df3bbe3c383387cf5439471e3b4f73947010687a855eb3036eccb" Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.313582 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t8fkx"] Mar 10 17:02:44 crc kubenswrapper[4749]: E0310 17:02:44.314501 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2eca018-61af-438e-aae9-c97ad72dfaa7" containerName="oc" Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.314515 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2eca018-61af-438e-aae9-c97ad72dfaa7" containerName="oc" Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.314697 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2eca018-61af-438e-aae9-c97ad72dfaa7" containerName="oc" Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.315907 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.343471 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8fkx"] Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.464610 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-utilities\") pod \"certified-operators-t8fkx\" (UID: \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\") " pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.464663 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk7xl\" (UniqueName: \"kubernetes.io/projected/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-kube-api-access-kk7xl\") pod \"certified-operators-t8fkx\" (UID: \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\") " pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.464691 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-catalog-content\") pod \"certified-operators-t8fkx\" (UID: \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\") " pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.566453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-utilities\") pod \"certified-operators-t8fkx\" (UID: \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\") " pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.566525 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kk7xl\" (UniqueName: \"kubernetes.io/projected/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-kube-api-access-kk7xl\") pod \"certified-operators-t8fkx\" (UID: \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\") " pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.566565 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-catalog-content\") pod \"certified-operators-t8fkx\" (UID: \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\") " pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.567262 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-utilities\") pod \"certified-operators-t8fkx\" (UID: \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\") " pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.567282 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-catalog-content\") pod \"certified-operators-t8fkx\" (UID: \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\") " pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.590646 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk7xl\" (UniqueName: \"kubernetes.io/projected/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-kube-api-access-kk7xl\") pod \"certified-operators-t8fkx\" (UID: \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\") " pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:44 crc kubenswrapper[4749]: I0310 17:02:44.641647 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:45 crc kubenswrapper[4749]: I0310 17:02:45.126062 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8fkx"] Mar 10 17:02:45 crc kubenswrapper[4749]: I0310 17:02:45.509701 4749 generic.go:334] "Generic (PLEG): container finished" podID="b6b7f8ac-9a2f-4bee-970c-8b52d679de40" containerID="975356c368d33470a28540d8f1e7a4a8e031fec5c1167f9f17ddec820e2824ed" exitCode=0 Mar 10 17:02:45 crc kubenswrapper[4749]: I0310 17:02:45.509765 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8fkx" event={"ID":"b6b7f8ac-9a2f-4bee-970c-8b52d679de40","Type":"ContainerDied","Data":"975356c368d33470a28540d8f1e7a4a8e031fec5c1167f9f17ddec820e2824ed"} Mar 10 17:02:45 crc kubenswrapper[4749]: I0310 17:02:45.509846 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8fkx" event={"ID":"b6b7f8ac-9a2f-4bee-970c-8b52d679de40","Type":"ContainerStarted","Data":"b75a4a48089a6224fe14d548b980b32502e44833d237dc6b51d57b65025558bb"} Mar 10 17:02:46 crc kubenswrapper[4749]: I0310 17:02:46.517927 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8fkx" event={"ID":"b6b7f8ac-9a2f-4bee-970c-8b52d679de40","Type":"ContainerStarted","Data":"8608469ed753322d0bc3cb88ea3c5ccfdf372ec494ef4c8e67757e42e21e4604"} Mar 10 17:02:47 crc kubenswrapper[4749]: I0310 17:02:47.527312 4749 generic.go:334] "Generic (PLEG): container finished" podID="b6b7f8ac-9a2f-4bee-970c-8b52d679de40" containerID="8608469ed753322d0bc3cb88ea3c5ccfdf372ec494ef4c8e67757e42e21e4604" exitCode=0 Mar 10 17:02:47 crc kubenswrapper[4749]: I0310 17:02:47.527424 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8fkx" 
event={"ID":"b6b7f8ac-9a2f-4bee-970c-8b52d679de40","Type":"ContainerDied","Data":"8608469ed753322d0bc3cb88ea3c5ccfdf372ec494ef4c8e67757e42e21e4604"} Mar 10 17:02:48 crc kubenswrapper[4749]: I0310 17:02:48.537541 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8fkx" event={"ID":"b6b7f8ac-9a2f-4bee-970c-8b52d679de40","Type":"ContainerStarted","Data":"870b7be7623d526d06da9b3982d284bdac3812d410effaf1f9f703edccb45c4e"} Mar 10 17:02:48 crc kubenswrapper[4749]: I0310 17:02:48.560922 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t8fkx" podStartSLOduration=1.9493054349999999 podStartE2EDuration="4.56090452s" podCreationTimestamp="2026-03-10 17:02:44 +0000 UTC" firstStartedPulling="2026-03-10 17:02:45.511459996 +0000 UTC m=+4462.633325683" lastFinishedPulling="2026-03-10 17:02:48.123059041 +0000 UTC m=+4465.244924768" observedRunningTime="2026-03-10 17:02:48.556758977 +0000 UTC m=+4465.678624674" watchObservedRunningTime="2026-03-10 17:02:48.56090452 +0000 UTC m=+4465.682770207" Mar 10 17:02:54 crc kubenswrapper[4749]: I0310 17:02:54.642202 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:54 crc kubenswrapper[4749]: I0310 17:02:54.642598 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:54 crc kubenswrapper[4749]: I0310 17:02:54.689720 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:55 crc kubenswrapper[4749]: I0310 17:02:55.637464 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:55 crc kubenswrapper[4749]: I0310 17:02:55.688307 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-t8fkx"] Mar 10 17:02:57 crc kubenswrapper[4749]: I0310 17:02:57.602231 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t8fkx" podUID="b6b7f8ac-9a2f-4bee-970c-8b52d679de40" containerName="registry-server" containerID="cri-o://870b7be7623d526d06da9b3982d284bdac3812d410effaf1f9f703edccb45c4e" gracePeriod=2 Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.041540 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.198820 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-utilities\") pod \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\" (UID: \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\") " Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.199040 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk7xl\" (UniqueName: \"kubernetes.io/projected/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-kube-api-access-kk7xl\") pod \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\" (UID: \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\") " Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.199103 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-catalog-content\") pod \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\" (UID: \"b6b7f8ac-9a2f-4bee-970c-8b52d679de40\") " Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.201826 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-utilities" (OuterVolumeSpecName: "utilities") pod "b6b7f8ac-9a2f-4bee-970c-8b52d679de40" (UID: 
"b6b7f8ac-9a2f-4bee-970c-8b52d679de40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.204912 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-kube-api-access-kk7xl" (OuterVolumeSpecName: "kube-api-access-kk7xl") pod "b6b7f8ac-9a2f-4bee-970c-8b52d679de40" (UID: "b6b7f8ac-9a2f-4bee-970c-8b52d679de40"). InnerVolumeSpecName "kube-api-access-kk7xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.271013 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6b7f8ac-9a2f-4bee-970c-8b52d679de40" (UID: "b6b7f8ac-9a2f-4bee-970c-8b52d679de40"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.300841 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.301448 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk7xl\" (UniqueName: \"kubernetes.io/projected/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-kube-api-access-kk7xl\") on node \"crc\" DevicePath \"\"" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.301473 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6b7f8ac-9a2f-4bee-970c-8b52d679de40-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.613408 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="b6b7f8ac-9a2f-4bee-970c-8b52d679de40" containerID="870b7be7623d526d06da9b3982d284bdac3812d410effaf1f9f703edccb45c4e" exitCode=0 Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.613481 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8fkx" event={"ID":"b6b7f8ac-9a2f-4bee-970c-8b52d679de40","Type":"ContainerDied","Data":"870b7be7623d526d06da9b3982d284bdac3812d410effaf1f9f703edccb45c4e"} Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.613526 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8fkx" event={"ID":"b6b7f8ac-9a2f-4bee-970c-8b52d679de40","Type":"ContainerDied","Data":"b75a4a48089a6224fe14d548b980b32502e44833d237dc6b51d57b65025558bb"} Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.613554 4749 scope.go:117] "RemoveContainer" containerID="870b7be7623d526d06da9b3982d284bdac3812d410effaf1f9f703edccb45c4e" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.613733 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t8fkx" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.636678 4749 scope.go:117] "RemoveContainer" containerID="8608469ed753322d0bc3cb88ea3c5ccfdf372ec494ef4c8e67757e42e21e4604" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.657874 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t8fkx"] Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.663155 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t8fkx"] Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.673399 4749 scope.go:117] "RemoveContainer" containerID="975356c368d33470a28540d8f1e7a4a8e031fec5c1167f9f17ddec820e2824ed" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.688578 4749 scope.go:117] "RemoveContainer" containerID="870b7be7623d526d06da9b3982d284bdac3812d410effaf1f9f703edccb45c4e" Mar 10 17:02:58 crc kubenswrapper[4749]: E0310 17:02:58.689165 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870b7be7623d526d06da9b3982d284bdac3812d410effaf1f9f703edccb45c4e\": container with ID starting with 870b7be7623d526d06da9b3982d284bdac3812d410effaf1f9f703edccb45c4e not found: ID does not exist" containerID="870b7be7623d526d06da9b3982d284bdac3812d410effaf1f9f703edccb45c4e" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.689203 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870b7be7623d526d06da9b3982d284bdac3812d410effaf1f9f703edccb45c4e"} err="failed to get container status \"870b7be7623d526d06da9b3982d284bdac3812d410effaf1f9f703edccb45c4e\": rpc error: code = NotFound desc = could not find container \"870b7be7623d526d06da9b3982d284bdac3812d410effaf1f9f703edccb45c4e\": container with ID starting with 870b7be7623d526d06da9b3982d284bdac3812d410effaf1f9f703edccb45c4e not 
found: ID does not exist" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.689228 4749 scope.go:117] "RemoveContainer" containerID="8608469ed753322d0bc3cb88ea3c5ccfdf372ec494ef4c8e67757e42e21e4604" Mar 10 17:02:58 crc kubenswrapper[4749]: E0310 17:02:58.689719 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8608469ed753322d0bc3cb88ea3c5ccfdf372ec494ef4c8e67757e42e21e4604\": container with ID starting with 8608469ed753322d0bc3cb88ea3c5ccfdf372ec494ef4c8e67757e42e21e4604 not found: ID does not exist" containerID="8608469ed753322d0bc3cb88ea3c5ccfdf372ec494ef4c8e67757e42e21e4604" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.689744 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8608469ed753322d0bc3cb88ea3c5ccfdf372ec494ef4c8e67757e42e21e4604"} err="failed to get container status \"8608469ed753322d0bc3cb88ea3c5ccfdf372ec494ef4c8e67757e42e21e4604\": rpc error: code = NotFound desc = could not find container \"8608469ed753322d0bc3cb88ea3c5ccfdf372ec494ef4c8e67757e42e21e4604\": container with ID starting with 8608469ed753322d0bc3cb88ea3c5ccfdf372ec494ef4c8e67757e42e21e4604 not found: ID does not exist" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.689760 4749 scope.go:117] "RemoveContainer" containerID="975356c368d33470a28540d8f1e7a4a8e031fec5c1167f9f17ddec820e2824ed" Mar 10 17:02:58 crc kubenswrapper[4749]: E0310 17:02:58.690015 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"975356c368d33470a28540d8f1e7a4a8e031fec5c1167f9f17ddec820e2824ed\": container with ID starting with 975356c368d33470a28540d8f1e7a4a8e031fec5c1167f9f17ddec820e2824ed not found: ID does not exist" containerID="975356c368d33470a28540d8f1e7a4a8e031fec5c1167f9f17ddec820e2824ed" Mar 10 17:02:58 crc kubenswrapper[4749]: I0310 17:02:58.690039 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"975356c368d33470a28540d8f1e7a4a8e031fec5c1167f9f17ddec820e2824ed"} err="failed to get container status \"975356c368d33470a28540d8f1e7a4a8e031fec5c1167f9f17ddec820e2824ed\": rpc error: code = NotFound desc = could not find container \"975356c368d33470a28540d8f1e7a4a8e031fec5c1167f9f17ddec820e2824ed\": container with ID starting with 975356c368d33470a28540d8f1e7a4a8e031fec5c1167f9f17ddec820e2824ed not found: ID does not exist" Mar 10 17:02:59 crc kubenswrapper[4749]: I0310 17:02:59.616398 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b7f8ac-9a2f-4bee-970c-8b52d679de40" path="/var/lib/kubelet/pods/b6b7f8ac-9a2f-4bee-970c-8b52d679de40/volumes" Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 17:04:00.138902 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552704-rtrmd"] Mar 10 17:04:00 crc kubenswrapper[4749]: E0310 17:04:00.139836 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b7f8ac-9a2f-4bee-970c-8b52d679de40" containerName="registry-server" Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 17:04:00.139933 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b7f8ac-9a2f-4bee-970c-8b52d679de40" containerName="registry-server" Mar 10 17:04:00 crc kubenswrapper[4749]: E0310 17:04:00.139972 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b7f8ac-9a2f-4bee-970c-8b52d679de40" containerName="extract-utilities" Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 17:04:00.139982 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b7f8ac-9a2f-4bee-970c-8b52d679de40" containerName="extract-utilities" Mar 10 17:04:00 crc kubenswrapper[4749]: E0310 17:04:00.139993 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b7f8ac-9a2f-4bee-970c-8b52d679de40" containerName="extract-content" Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 
17:04:00.140003 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b7f8ac-9a2f-4bee-970c-8b52d679de40" containerName="extract-content" Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 17:04:00.140214 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b7f8ac-9a2f-4bee-970c-8b52d679de40" containerName="registry-server" Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 17:04:00.140797 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552704-rtrmd" Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 17:04:00.143042 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 17:04:00.143208 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 17:04:00.146248 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 17:04:00.147257 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552704-rtrmd"] Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 17:04:00.287683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4phxn\" (UniqueName: \"kubernetes.io/projected/4e869a60-e444-417b-9256-e63fe1b7cfa0-kube-api-access-4phxn\") pod \"auto-csr-approver-29552704-rtrmd\" (UID: \"4e869a60-e444-417b-9256-e63fe1b7cfa0\") " pod="openshift-infra/auto-csr-approver-29552704-rtrmd" Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 17:04:00.389388 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4phxn\" (UniqueName: \"kubernetes.io/projected/4e869a60-e444-417b-9256-e63fe1b7cfa0-kube-api-access-4phxn\") pod \"auto-csr-approver-29552704-rtrmd\" (UID: 
\"4e869a60-e444-417b-9256-e63fe1b7cfa0\") " pod="openshift-infra/auto-csr-approver-29552704-rtrmd" Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 17:04:00.409521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4phxn\" (UniqueName: \"kubernetes.io/projected/4e869a60-e444-417b-9256-e63fe1b7cfa0-kube-api-access-4phxn\") pod \"auto-csr-approver-29552704-rtrmd\" (UID: \"4e869a60-e444-417b-9256-e63fe1b7cfa0\") " pod="openshift-infra/auto-csr-approver-29552704-rtrmd" Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 17:04:00.461641 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552704-rtrmd" Mar 10 17:04:00 crc kubenswrapper[4749]: I0310 17:04:00.921227 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552704-rtrmd"] Mar 10 17:04:01 crc kubenswrapper[4749]: I0310 17:04:01.522828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552704-rtrmd" event={"ID":"4e869a60-e444-417b-9256-e63fe1b7cfa0","Type":"ContainerStarted","Data":"8bf11c306d60813289448ef1875ed44946e27c29200380a846bb6ae7667a0cab"} Mar 10 17:04:03 crc kubenswrapper[4749]: I0310 17:04:03.542898 4749 generic.go:334] "Generic (PLEG): container finished" podID="4e869a60-e444-417b-9256-e63fe1b7cfa0" containerID="3cc7a88cdab6ba4db24b312ed2ad280070807e43ddf3934e2da2d72697432650" exitCode=0 Mar 10 17:04:03 crc kubenswrapper[4749]: I0310 17:04:03.542981 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552704-rtrmd" event={"ID":"4e869a60-e444-417b-9256-e63fe1b7cfa0","Type":"ContainerDied","Data":"3cc7a88cdab6ba4db24b312ed2ad280070807e43ddf3934e2da2d72697432650"} Mar 10 17:04:04 crc kubenswrapper[4749]: I0310 17:04:04.829817 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552704-rtrmd" Mar 10 17:04:04 crc kubenswrapper[4749]: I0310 17:04:04.952220 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4phxn\" (UniqueName: \"kubernetes.io/projected/4e869a60-e444-417b-9256-e63fe1b7cfa0-kube-api-access-4phxn\") pod \"4e869a60-e444-417b-9256-e63fe1b7cfa0\" (UID: \"4e869a60-e444-417b-9256-e63fe1b7cfa0\") " Mar 10 17:04:04 crc kubenswrapper[4749]: I0310 17:04:04.959796 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e869a60-e444-417b-9256-e63fe1b7cfa0-kube-api-access-4phxn" (OuterVolumeSpecName: "kube-api-access-4phxn") pod "4e869a60-e444-417b-9256-e63fe1b7cfa0" (UID: "4e869a60-e444-417b-9256-e63fe1b7cfa0"). InnerVolumeSpecName "kube-api-access-4phxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:04:05 crc kubenswrapper[4749]: I0310 17:04:05.053207 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4phxn\" (UniqueName: \"kubernetes.io/projected/4e869a60-e444-417b-9256-e63fe1b7cfa0-kube-api-access-4phxn\") on node \"crc\" DevicePath \"\"" Mar 10 17:04:05 crc kubenswrapper[4749]: I0310 17:04:05.562064 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552704-rtrmd" event={"ID":"4e869a60-e444-417b-9256-e63fe1b7cfa0","Type":"ContainerDied","Data":"8bf11c306d60813289448ef1875ed44946e27c29200380a846bb6ae7667a0cab"} Mar 10 17:04:05 crc kubenswrapper[4749]: I0310 17:04:05.562108 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bf11c306d60813289448ef1875ed44946e27c29200380a846bb6ae7667a0cab" Mar 10 17:04:05 crc kubenswrapper[4749]: I0310 17:04:05.562169 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552704-rtrmd" Mar 10 17:04:05 crc kubenswrapper[4749]: I0310 17:04:05.902226 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552698-l9z46"] Mar 10 17:04:05 crc kubenswrapper[4749]: I0310 17:04:05.910018 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552698-l9z46"] Mar 10 17:04:07 crc kubenswrapper[4749]: I0310 17:04:07.618361 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b089428-11c1-44f6-bd22-dd3a2da969e2" path="/var/lib/kubelet/pods/7b089428-11c1-44f6-bd22-dd3a2da969e2/volumes" Mar 10 17:04:26 crc kubenswrapper[4749]: I0310 17:04:26.484739 4749 scope.go:117] "RemoveContainer" containerID="e57eb39468f4dcc29df16e35709e82753763f3b2d3147c6d885789d5440727b7" Mar 10 17:04:50 crc kubenswrapper[4749]: I0310 17:04:50.981072 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:04:50 crc kubenswrapper[4749]: I0310 17:04:50.981823 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:05:20 crc kubenswrapper[4749]: I0310 17:05:20.980322 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:05:20 crc kubenswrapper[4749]: 
I0310 17:05:20.980849 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:05:50 crc kubenswrapper[4749]: I0310 17:05:50.980498 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:05:50 crc kubenswrapper[4749]: I0310 17:05:50.981158 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:05:50 crc kubenswrapper[4749]: I0310 17:05:50.981220 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 17:05:50 crc kubenswrapper[4749]: I0310 17:05:50.982064 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 17:05:50 crc kubenswrapper[4749]: I0310 17:05:50.982162 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" 
containerName="machine-config-daemon" containerID="cri-o://14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" gracePeriod=600 Mar 10 17:05:51 crc kubenswrapper[4749]: E0310 17:05:51.129918 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:05:51 crc kubenswrapper[4749]: I0310 17:05:51.649981 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" exitCode=0 Mar 10 17:05:51 crc kubenswrapper[4749]: I0310 17:05:51.650184 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"} Mar 10 17:05:51 crc kubenswrapper[4749]: I0310 17:05:51.650321 4749 scope.go:117] "RemoveContainer" containerID="50143155b07c28aad6ceff4991afedb9e3d0e4ace9db1b7305fb156222ab014a" Mar 10 17:05:51 crc kubenswrapper[4749]: I0310 17:05:51.650880 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" Mar 10 17:05:51 crc kubenswrapper[4749]: E0310 17:05:51.651078 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:06:00 crc kubenswrapper[4749]: I0310 17:06:00.155778 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552706-m9rpd"] Mar 10 17:06:00 crc kubenswrapper[4749]: E0310 17:06:00.156765 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e869a60-e444-417b-9256-e63fe1b7cfa0" containerName="oc" Mar 10 17:06:00 crc kubenswrapper[4749]: I0310 17:06:00.156784 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e869a60-e444-417b-9256-e63fe1b7cfa0" containerName="oc" Mar 10 17:06:00 crc kubenswrapper[4749]: I0310 17:06:00.156990 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e869a60-e444-417b-9256-e63fe1b7cfa0" containerName="oc" Mar 10 17:06:00 crc kubenswrapper[4749]: I0310 17:06:00.157722 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552706-m9rpd" Mar 10 17:06:00 crc kubenswrapper[4749]: I0310 17:06:00.164340 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:06:00 crc kubenswrapper[4749]: I0310 17:06:00.164651 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:06:00 crc kubenswrapper[4749]: I0310 17:06:00.164784 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:06:00 crc kubenswrapper[4749]: I0310 17:06:00.184943 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552706-m9rpd"] Mar 10 17:06:00 crc kubenswrapper[4749]: I0310 17:06:00.305630 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5h9z\" (UniqueName: 
\"kubernetes.io/projected/70c25b49-4263-4673-b2eb-0b423da84861-kube-api-access-l5h9z\") pod \"auto-csr-approver-29552706-m9rpd\" (UID: \"70c25b49-4263-4673-b2eb-0b423da84861\") " pod="openshift-infra/auto-csr-approver-29552706-m9rpd" Mar 10 17:06:00 crc kubenswrapper[4749]: I0310 17:06:00.406603 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5h9z\" (UniqueName: \"kubernetes.io/projected/70c25b49-4263-4673-b2eb-0b423da84861-kube-api-access-l5h9z\") pod \"auto-csr-approver-29552706-m9rpd\" (UID: \"70c25b49-4263-4673-b2eb-0b423da84861\") " pod="openshift-infra/auto-csr-approver-29552706-m9rpd" Mar 10 17:06:00 crc kubenswrapper[4749]: I0310 17:06:00.567808 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5h9z\" (UniqueName: \"kubernetes.io/projected/70c25b49-4263-4673-b2eb-0b423da84861-kube-api-access-l5h9z\") pod \"auto-csr-approver-29552706-m9rpd\" (UID: \"70c25b49-4263-4673-b2eb-0b423da84861\") " pod="openshift-infra/auto-csr-approver-29552706-m9rpd" Mar 10 17:06:00 crc kubenswrapper[4749]: I0310 17:06:00.788124 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552706-m9rpd" Mar 10 17:06:01 crc kubenswrapper[4749]: I0310 17:06:01.297723 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552706-m9rpd"] Mar 10 17:06:01 crc kubenswrapper[4749]: I0310 17:06:01.314639 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 17:06:01 crc kubenswrapper[4749]: I0310 17:06:01.731753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552706-m9rpd" event={"ID":"70c25b49-4263-4673-b2eb-0b423da84861","Type":"ContainerStarted","Data":"202a2f0facf0c1cf7d8f03cba2f743a8fdf3a8af3e09c51fd9a303e254d08ddf"} Mar 10 17:06:03 crc kubenswrapper[4749]: I0310 17:06:03.747361 4749 generic.go:334] "Generic (PLEG): container finished" podID="70c25b49-4263-4673-b2eb-0b423da84861" containerID="8549f4074d1a6454cd0c68a1c87f4942bba984a093c352256b5e85c28908bff0" exitCode=0 Mar 10 17:06:03 crc kubenswrapper[4749]: I0310 17:06:03.747491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552706-m9rpd" event={"ID":"70c25b49-4263-4673-b2eb-0b423da84861","Type":"ContainerDied","Data":"8549f4074d1a6454cd0c68a1c87f4942bba984a093c352256b5e85c28908bff0"} Mar 10 17:06:04 crc kubenswrapper[4749]: I0310 17:06:04.608112 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" Mar 10 17:06:04 crc kubenswrapper[4749]: E0310 17:06:04.608658 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 
17:06:05 crc kubenswrapper[4749]: I0310 17:06:05.124800 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552706-m9rpd" Mar 10 17:06:05 crc kubenswrapper[4749]: I0310 17:06:05.306107 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5h9z\" (UniqueName: \"kubernetes.io/projected/70c25b49-4263-4673-b2eb-0b423da84861-kube-api-access-l5h9z\") pod \"70c25b49-4263-4673-b2eb-0b423da84861\" (UID: \"70c25b49-4263-4673-b2eb-0b423da84861\") " Mar 10 17:06:05 crc kubenswrapper[4749]: I0310 17:06:05.311907 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c25b49-4263-4673-b2eb-0b423da84861-kube-api-access-l5h9z" (OuterVolumeSpecName: "kube-api-access-l5h9z") pod "70c25b49-4263-4673-b2eb-0b423da84861" (UID: "70c25b49-4263-4673-b2eb-0b423da84861"). InnerVolumeSpecName "kube-api-access-l5h9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:06:05 crc kubenswrapper[4749]: I0310 17:06:05.407741 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5h9z\" (UniqueName: \"kubernetes.io/projected/70c25b49-4263-4673-b2eb-0b423da84861-kube-api-access-l5h9z\") on node \"crc\" DevicePath \"\"" Mar 10 17:06:05 crc kubenswrapper[4749]: I0310 17:06:05.763996 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552706-m9rpd" event={"ID":"70c25b49-4263-4673-b2eb-0b423da84861","Type":"ContainerDied","Data":"202a2f0facf0c1cf7d8f03cba2f743a8fdf3a8af3e09c51fd9a303e254d08ddf"} Mar 10 17:06:05 crc kubenswrapper[4749]: I0310 17:06:05.764032 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552706-m9rpd" Mar 10 17:06:05 crc kubenswrapper[4749]: I0310 17:06:05.764040 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="202a2f0facf0c1cf7d8f03cba2f743a8fdf3a8af3e09c51fd9a303e254d08ddf" Mar 10 17:06:06 crc kubenswrapper[4749]: I0310 17:06:06.207224 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552700-mspvw"] Mar 10 17:06:06 crc kubenswrapper[4749]: I0310 17:06:06.213494 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552700-mspvw"] Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.326614 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ljh4j"] Mar 10 17:06:07 crc kubenswrapper[4749]: E0310 17:06:07.328522 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c25b49-4263-4673-b2eb-0b423da84861" containerName="oc" Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.328632 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c25b49-4263-4673-b2eb-0b423da84861" containerName="oc" Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.330108 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c25b49-4263-4673-b2eb-0b423da84861" containerName="oc" Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.344057 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ljh4j"] Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.344395 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ljh4j" Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.441142 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwrrb\" (UniqueName: \"kubernetes.io/projected/6c6f5349-293c-4f89-a55c-688b8c3010b0-kube-api-access-pwrrb\") pod \"redhat-operators-ljh4j\" (UID: \"6c6f5349-293c-4f89-a55c-688b8c3010b0\") " pod="openshift-marketplace/redhat-operators-ljh4j" Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.441196 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c6f5349-293c-4f89-a55c-688b8c3010b0-catalog-content\") pod \"redhat-operators-ljh4j\" (UID: \"6c6f5349-293c-4f89-a55c-688b8c3010b0\") " pod="openshift-marketplace/redhat-operators-ljh4j" Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.441267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c6f5349-293c-4f89-a55c-688b8c3010b0-utilities\") pod \"redhat-operators-ljh4j\" (UID: \"6c6f5349-293c-4f89-a55c-688b8c3010b0\") " pod="openshift-marketplace/redhat-operators-ljh4j" Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.541939 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c6f5349-293c-4f89-a55c-688b8c3010b0-utilities\") pod \"redhat-operators-ljh4j\" (UID: \"6c6f5349-293c-4f89-a55c-688b8c3010b0\") " pod="openshift-marketplace/redhat-operators-ljh4j" Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.542022 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwrrb\" (UniqueName: \"kubernetes.io/projected/6c6f5349-293c-4f89-a55c-688b8c3010b0-kube-api-access-pwrrb\") pod \"redhat-operators-ljh4j\" (UID: 
\"6c6f5349-293c-4f89-a55c-688b8c3010b0\") " pod="openshift-marketplace/redhat-operators-ljh4j"
Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.542059 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c6f5349-293c-4f89-a55c-688b8c3010b0-catalog-content\") pod \"redhat-operators-ljh4j\" (UID: \"6c6f5349-293c-4f89-a55c-688b8c3010b0\") " pod="openshift-marketplace/redhat-operators-ljh4j"
Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.542619 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c6f5349-293c-4f89-a55c-688b8c3010b0-utilities\") pod \"redhat-operators-ljh4j\" (UID: \"6c6f5349-293c-4f89-a55c-688b8c3010b0\") " pod="openshift-marketplace/redhat-operators-ljh4j"
Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.542626 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c6f5349-293c-4f89-a55c-688b8c3010b0-catalog-content\") pod \"redhat-operators-ljh4j\" (UID: \"6c6f5349-293c-4f89-a55c-688b8c3010b0\") " pod="openshift-marketplace/redhat-operators-ljh4j"
Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.564122 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwrrb\" (UniqueName: \"kubernetes.io/projected/6c6f5349-293c-4f89-a55c-688b8c3010b0-kube-api-access-pwrrb\") pod \"redhat-operators-ljh4j\" (UID: \"6c6f5349-293c-4f89-a55c-688b8c3010b0\") " pod="openshift-marketplace/redhat-operators-ljh4j"
Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.617017 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a50fbf-1816-4d0b-b32a-0ea4842556cc" path="/var/lib/kubelet/pods/a4a50fbf-1816-4d0b-b32a-0ea4842556cc/volumes"
Mar 10 17:06:07 crc kubenswrapper[4749]: I0310 17:06:07.720666 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ljh4j"
Mar 10 17:06:08 crc kubenswrapper[4749]: I0310 17:06:08.213772 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ljh4j"]
Mar 10 17:06:08 crc kubenswrapper[4749]: I0310 17:06:08.785996 4749 generic.go:334] "Generic (PLEG): container finished" podID="6c6f5349-293c-4f89-a55c-688b8c3010b0" containerID="d2d08ec3ef080f9c6ea981ad105f5489c83bec205f0e34f8ac68a66af8c70a29" exitCode=0
Mar 10 17:06:08 crc kubenswrapper[4749]: I0310 17:06:08.786109 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ljh4j" event={"ID":"6c6f5349-293c-4f89-a55c-688b8c3010b0","Type":"ContainerDied","Data":"d2d08ec3ef080f9c6ea981ad105f5489c83bec205f0e34f8ac68a66af8c70a29"}
Mar 10 17:06:08 crc kubenswrapper[4749]: I0310 17:06:08.786272 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ljh4j" event={"ID":"6c6f5349-293c-4f89-a55c-688b8c3010b0","Type":"ContainerStarted","Data":"907b4a32bfa2e5d6d0e2cf538596cc629244c68dd8a9bf9f9dc1a03a706f1201"}
Mar 10 17:06:10 crc kubenswrapper[4749]: I0310 17:06:10.810153 4749 generic.go:334] "Generic (PLEG): container finished" podID="6c6f5349-293c-4f89-a55c-688b8c3010b0" containerID="228ad17ace37ebb4ca0ed1f3020d1d81e6b9b91c2341430400f1ae36b05a6dde" exitCode=0
Mar 10 17:06:10 crc kubenswrapper[4749]: I0310 17:06:10.810226 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ljh4j" event={"ID":"6c6f5349-293c-4f89-a55c-688b8c3010b0","Type":"ContainerDied","Data":"228ad17ace37ebb4ca0ed1f3020d1d81e6b9b91c2341430400f1ae36b05a6dde"}
Mar 10 17:06:11 crc kubenswrapper[4749]: I0310 17:06:11.822463 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ljh4j" event={"ID":"6c6f5349-293c-4f89-a55c-688b8c3010b0","Type":"ContainerStarted","Data":"13fb1c510bf963fe5f9c40a17fbbd4977d6281118ef74126b6d297ca9870b6e2"}
Mar 10 17:06:11 crc kubenswrapper[4749]: I0310 17:06:11.845707 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ljh4j" podStartSLOduration=2.417753463 podStartE2EDuration="4.845682977s" podCreationTimestamp="2026-03-10 17:06:07 +0000 UTC" firstStartedPulling="2026-03-10 17:06:08.787751003 +0000 UTC m=+4665.909616690" lastFinishedPulling="2026-03-10 17:06:11.215680477 +0000 UTC m=+4668.337546204" observedRunningTime="2026-03-10 17:06:11.842558702 +0000 UTC m=+4668.964424439" watchObservedRunningTime="2026-03-10 17:06:11.845682977 +0000 UTC m=+4668.967548684"
Mar 10 17:06:16 crc kubenswrapper[4749]: I0310 17:06:16.607079 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"
Mar 10 17:06:16 crc kubenswrapper[4749]: E0310 17:06:16.608579 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:06:17 crc kubenswrapper[4749]: I0310 17:06:17.721235 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ljh4j"
Mar 10 17:06:17 crc kubenswrapper[4749]: I0310 17:06:17.721680 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ljh4j"
Mar 10 17:06:18 crc kubenswrapper[4749]: I0310 17:06:18.781563 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ljh4j" podUID="6c6f5349-293c-4f89-a55c-688b8c3010b0" containerName="registry-server" probeResult="failure" output=<
Mar 10 17:06:18 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s
Mar 10 17:06:18 crc kubenswrapper[4749]: >
Mar 10 17:06:26 crc kubenswrapper[4749]: I0310 17:06:26.581903 4749 scope.go:117] "RemoveContainer" containerID="739d1249f9bcd35fe5bc32c73ad3e12c272f652ec2a2b467a1a9419c8abfa035"
Mar 10 17:06:27 crc kubenswrapper[4749]: I0310 17:06:27.607191 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"
Mar 10 17:06:27 crc kubenswrapper[4749]: E0310 17:06:27.608947 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:06:27 crc kubenswrapper[4749]: I0310 17:06:27.790618 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ljh4j"
Mar 10 17:06:27 crc kubenswrapper[4749]: I0310 17:06:27.853155 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ljh4j"
Mar 10 17:06:28 crc kubenswrapper[4749]: I0310 17:06:28.027967 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ljh4j"]
Mar 10 17:06:28 crc kubenswrapper[4749]: I0310 17:06:28.952870 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ljh4j" podUID="6c6f5349-293c-4f89-a55c-688b8c3010b0" containerName="registry-server" containerID="cri-o://13fb1c510bf963fe5f9c40a17fbbd4977d6281118ef74126b6d297ca9870b6e2" gracePeriod=2
Mar 10 17:06:29 crc kubenswrapper[4749]: I0310 17:06:29.965724 4749 generic.go:334] "Generic (PLEG): container finished" podID="6c6f5349-293c-4f89-a55c-688b8c3010b0" containerID="13fb1c510bf963fe5f9c40a17fbbd4977d6281118ef74126b6d297ca9870b6e2" exitCode=0
Mar 10 17:06:29 crc kubenswrapper[4749]: I0310 17:06:29.965823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ljh4j" event={"ID":"6c6f5349-293c-4f89-a55c-688b8c3010b0","Type":"ContainerDied","Data":"13fb1c510bf963fe5f9c40a17fbbd4977d6281118ef74126b6d297ca9870b6e2"}
Mar 10 17:06:30 crc kubenswrapper[4749]: I0310 17:06:30.183810 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ljh4j"
Mar 10 17:06:30 crc kubenswrapper[4749]: I0310 17:06:30.352030 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwrrb\" (UniqueName: \"kubernetes.io/projected/6c6f5349-293c-4f89-a55c-688b8c3010b0-kube-api-access-pwrrb\") pod \"6c6f5349-293c-4f89-a55c-688b8c3010b0\" (UID: \"6c6f5349-293c-4f89-a55c-688b8c3010b0\") "
Mar 10 17:06:30 crc kubenswrapper[4749]: I0310 17:06:30.352181 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c6f5349-293c-4f89-a55c-688b8c3010b0-utilities\") pod \"6c6f5349-293c-4f89-a55c-688b8c3010b0\" (UID: \"6c6f5349-293c-4f89-a55c-688b8c3010b0\") "
Mar 10 17:06:30 crc kubenswrapper[4749]: I0310 17:06:30.352238 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c6f5349-293c-4f89-a55c-688b8c3010b0-catalog-content\") pod \"6c6f5349-293c-4f89-a55c-688b8c3010b0\" (UID: \"6c6f5349-293c-4f89-a55c-688b8c3010b0\") "
Mar 10 17:06:30 crc kubenswrapper[4749]: I0310 17:06:30.353034 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/empty-dir/6c6f5349-293c-4f89-a55c-688b8c3010b0-utilities" (OuterVolumeSpecName: "utilities") pod "6c6f5349-293c-4f89-a55c-688b8c3010b0" (UID: "6c6f5349-293c-4f89-a55c-688b8c3010b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 17:06:30 crc kubenswrapper[4749]: I0310 17:06:30.361627 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c6f5349-293c-4f89-a55c-688b8c3010b0-kube-api-access-pwrrb" (OuterVolumeSpecName: "kube-api-access-pwrrb") pod "6c6f5349-293c-4f89-a55c-688b8c3010b0" (UID: "6c6f5349-293c-4f89-a55c-688b8c3010b0"). InnerVolumeSpecName "kube-api-access-pwrrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 17:06:30 crc kubenswrapper[4749]: I0310 17:06:30.454220 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c6f5349-293c-4f89-a55c-688b8c3010b0-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 17:06:30 crc kubenswrapper[4749]: I0310 17:06:30.454268 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwrrb\" (UniqueName: \"kubernetes.io/projected/6c6f5349-293c-4f89-a55c-688b8c3010b0-kube-api-access-pwrrb\") on node \"crc\" DevicePath \"\""
Mar 10 17:06:30 crc kubenswrapper[4749]: I0310 17:06:30.497698 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c6f5349-293c-4f89-a55c-688b8c3010b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c6f5349-293c-4f89-a55c-688b8c3010b0" (UID: "6c6f5349-293c-4f89-a55c-688b8c3010b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 17:06:30 crc kubenswrapper[4749]: I0310 17:06:30.556586 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c6f5349-293c-4f89-a55c-688b8c3010b0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 17:06:30 crc kubenswrapper[4749]: I0310 17:06:30.975451 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ljh4j" event={"ID":"6c6f5349-293c-4f89-a55c-688b8c3010b0","Type":"ContainerDied","Data":"907b4a32bfa2e5d6d0e2cf538596cc629244c68dd8a9bf9f9dc1a03a706f1201"}
Mar 10 17:06:30 crc kubenswrapper[4749]: I0310 17:06:30.975509 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ljh4j"
Mar 10 17:06:30 crc kubenswrapper[4749]: I0310 17:06:30.975513 4749 scope.go:117] "RemoveContainer" containerID="13fb1c510bf963fe5f9c40a17fbbd4977d6281118ef74126b6d297ca9870b6e2"
Mar 10 17:06:30 crc kubenswrapper[4749]: I0310 17:06:30.999818 4749 scope.go:117] "RemoveContainer" containerID="228ad17ace37ebb4ca0ed1f3020d1d81e6b9b91c2341430400f1ae36b05a6dde"
Mar 10 17:06:31 crc kubenswrapper[4749]: I0310 17:06:31.005678 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ljh4j"]
Mar 10 17:06:31 crc kubenswrapper[4749]: I0310 17:06:31.012033 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ljh4j"]
Mar 10 17:06:31 crc kubenswrapper[4749]: I0310 17:06:31.038901 4749 scope.go:117] "RemoveContainer" containerID="d2d08ec3ef080f9c6ea981ad105f5489c83bec205f0e34f8ac68a66af8c70a29"
Mar 10 17:06:31 crc kubenswrapper[4749]: I0310 17:06:31.616483 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c6f5349-293c-4f89-a55c-688b8c3010b0" path="/var/lib/kubelet/pods/6c6f5349-293c-4f89-a55c-688b8c3010b0/volumes"
Mar 10 17:06:39 crc kubenswrapper[4749]: I0310 17:06:39.607250 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"
Mar 10 17:06:39 crc kubenswrapper[4749]: E0310 17:06:39.608435 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:06:53 crc kubenswrapper[4749]: I0310 17:06:53.611720 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"
Mar 10 17:06:53 crc kubenswrapper[4749]: E0310 17:06:53.612559 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:07:05 crc kubenswrapper[4749]: I0310 17:07:05.606569 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"
Mar 10 17:07:05 crc kubenswrapper[4749]: E0310 17:07:05.607423 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:07:18 crc kubenswrapper[4749]: I0310 17:07:18.606709 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"
Mar 10 17:07:18 crc kubenswrapper[4749]: E0310 17:07:18.607653 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:07:32 crc kubenswrapper[4749]: I0310 17:07:32.606762 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"
Mar 10 17:07:32 crc kubenswrapper[4749]: E0310 17:07:32.607951 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:07:47 crc kubenswrapper[4749]: I0310 17:07:47.607214 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"
Mar 10 17:07:47 crc kubenswrapper[4749]: E0310 17:07:47.608174 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.156882 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552708-lt2mw"]
Mar 10 17:08:00 crc kubenswrapper[4749]: E0310 17:08:00.158279 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c6f5349-293c-4f89-a55c-688b8c3010b0" containerName="registry-server"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.158316 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c6f5349-293c-4f89-a55c-688b8c3010b0" containerName="registry-server"
Mar 10 17:08:00 crc kubenswrapper[4749]: E0310 17:08:00.158362 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c6f5349-293c-4f89-a55c-688b8c3010b0" containerName="extract-utilities"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.158431 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c6f5349-293c-4f89-a55c-688b8c3010b0" containerName="extract-utilities"
Mar 10 17:08:00 crc kubenswrapper[4749]: E0310 17:08:00.158471 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c6f5349-293c-4f89-a55c-688b8c3010b0" containerName="extract-content"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.158491 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c6f5349-293c-4f89-a55c-688b8c3010b0" containerName="extract-content"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.158823 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c6f5349-293c-4f89-a55c-688b8c3010b0" containerName="registry-server"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.159814 4749 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552708-lt2mw"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.165287 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.165473 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.166041 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.169473 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552708-lt2mw"]
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.311010 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vwnd\" (UniqueName: \"kubernetes.io/projected/1093e396-a552-4168-a523-77c17d2f5f81-kube-api-access-9vwnd\") pod \"auto-csr-approver-29552708-lt2mw\" (UID: \"1093e396-a552-4168-a523-77c17d2f5f81\") " pod="openshift-infra/auto-csr-approver-29552708-lt2mw"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.412862 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vwnd\" (UniqueName: \"kubernetes.io/projected/1093e396-a552-4168-a523-77c17d2f5f81-kube-api-access-9vwnd\") pod \"auto-csr-approver-29552708-lt2mw\" (UID: \"1093e396-a552-4168-a523-77c17d2f5f81\") " pod="openshift-infra/auto-csr-approver-29552708-lt2mw"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.432999 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vwnd\" (UniqueName: \"kubernetes.io/projected/1093e396-a552-4168-a523-77c17d2f5f81-kube-api-access-9vwnd\") pod \"auto-csr-approver-29552708-lt2mw\" (UID: \"1093e396-a552-4168-a523-77c17d2f5f81\") " pod="openshift-infra/auto-csr-approver-29552708-lt2mw"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.490876 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552708-lt2mw"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.606954 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"
Mar 10 17:08:00 crc kubenswrapper[4749]: E0310 17:08:00.607529 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:08:00 crc kubenswrapper[4749]: I0310 17:08:00.908386 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552708-lt2mw"]
Mar 10 17:08:01 crc kubenswrapper[4749]: I0310 17:08:01.771721 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552708-lt2mw" event={"ID":"1093e396-a552-4168-a523-77c17d2f5f81","Type":"ContainerStarted","Data":"6daed2a9281069dee0f1b448c96b12154376c48cf5e5224eb40f0797c70ee23f"}
Mar 10 17:08:02 crc kubenswrapper[4749]: I0310 17:08:02.783209 4749 generic.go:334] "Generic (PLEG): container finished" podID="1093e396-a552-4168-a523-77c17d2f5f81" containerID="8654cb0abfa992576dc7f097aa0d5c9d19b4c667ea83e12f917c66b7513ef295" exitCode=0
Mar 10 17:08:02 crc kubenswrapper[4749]: I0310 17:08:02.783552 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552708-lt2mw" event={"ID":"1093e396-a552-4168-a523-77c17d2f5f81","Type":"ContainerDied","Data":"8654cb0abfa992576dc7f097aa0d5c9d19b4c667ea83e12f917c66b7513ef295"}
Mar 10 17:08:04 crc kubenswrapper[4749]: I0310 17:08:04.210280 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552708-lt2mw"
Mar 10 17:08:04 crc kubenswrapper[4749]: I0310 17:08:04.386067 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vwnd\" (UniqueName: \"kubernetes.io/projected/1093e396-a552-4168-a523-77c17d2f5f81-kube-api-access-9vwnd\") pod \"1093e396-a552-4168-a523-77c17d2f5f81\" (UID: \"1093e396-a552-4168-a523-77c17d2f5f81\") "
Mar 10 17:08:04 crc kubenswrapper[4749]: I0310 17:08:04.395309 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1093e396-a552-4168-a523-77c17d2f5f81-kube-api-access-9vwnd" (OuterVolumeSpecName: "kube-api-access-9vwnd") pod "1093e396-a552-4168-a523-77c17d2f5f81" (UID: "1093e396-a552-4168-a523-77c17d2f5f81"). InnerVolumeSpecName "kube-api-access-9vwnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 17:08:04 crc kubenswrapper[4749]: I0310 17:08:04.488684 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vwnd\" (UniqueName: \"kubernetes.io/projected/1093e396-a552-4168-a523-77c17d2f5f81-kube-api-access-9vwnd\") on node \"crc\" DevicePath \"\""
Mar 10 17:08:04 crc kubenswrapper[4749]: I0310 17:08:04.805173 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552708-lt2mw" event={"ID":"1093e396-a552-4168-a523-77c17d2f5f81","Type":"ContainerDied","Data":"6daed2a9281069dee0f1b448c96b12154376c48cf5e5224eb40f0797c70ee23f"}
Mar 10 17:08:04 crc kubenswrapper[4749]: I0310 17:08:04.805769 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6daed2a9281069dee0f1b448c96b12154376c48cf5e5224eb40f0797c70ee23f"
Mar 10 17:08:04 crc kubenswrapper[4749]: I0310 17:08:04.805241 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552708-lt2mw"
Mar 10 17:08:05 crc kubenswrapper[4749]: I0310 17:08:05.281734 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552702-rnb2h"]
Mar 10 17:08:05 crc kubenswrapper[4749]: I0310 17:08:05.287086 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552702-rnb2h"]
Mar 10 17:08:05 crc kubenswrapper[4749]: I0310 17:08:05.624183 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2eca018-61af-438e-aae9-c97ad72dfaa7" path="/var/lib/kubelet/pods/f2eca018-61af-438e-aae9-c97ad72dfaa7/volumes"
Mar 10 17:08:09 crc kubenswrapper[4749]: I0310 17:08:09.964702 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bss87"]
Mar 10 17:08:09 crc kubenswrapper[4749]: E0310 17:08:09.965425 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1093e396-a552-4168-a523-77c17d2f5f81" containerName="oc"
Mar 10 17:08:09 crc kubenswrapper[4749]: I0310 17:08:09.965437 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1093e396-a552-4168-a523-77c17d2f5f81" containerName="oc"
Mar 10 17:08:09 crc kubenswrapper[4749]: I0310 17:08:09.965565 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1093e396-a552-4168-a523-77c17d2f5f81" containerName="oc"
Mar 10 17:08:09 crc kubenswrapper[4749]: I0310 17:08:09.966445 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:09 crc kubenswrapper[4749]: I0310 17:08:09.992906 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bss87"]
Mar 10 17:08:10 crc kubenswrapper[4749]: I0310 17:08:10.094421 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee003fa3-4ae9-4c90-b431-75cdd2198a06-catalog-content\") pod \"redhat-marketplace-bss87\" (UID: \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\") " pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:10 crc kubenswrapper[4749]: I0310 17:08:10.094504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee003fa3-4ae9-4c90-b431-75cdd2198a06-utilities\") pod \"redhat-marketplace-bss87\" (UID: \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\") " pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:10 crc kubenswrapper[4749]: I0310 17:08:10.094608 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpbzg\" (UniqueName: \"kubernetes.io/projected/ee003fa3-4ae9-4c90-b431-75cdd2198a06-kube-api-access-gpbzg\") pod \"redhat-marketplace-bss87\" (UID: \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\") " pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:10 crc kubenswrapper[4749]: I0310 17:08:10.195701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee003fa3-4ae9-4c90-b431-75cdd2198a06-catalog-content\") pod \"redhat-marketplace-bss87\" (UID: \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\") " pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:10 crc kubenswrapper[4749]: I0310 17:08:10.195782 4749 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee003fa3-4ae9-4c90-b431-75cdd2198a06-utilities\") pod \"redhat-marketplace-bss87\" (UID: \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\") " pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:10 crc kubenswrapper[4749]: I0310 17:08:10.195840 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpbzg\" (UniqueName: \"kubernetes.io/projected/ee003fa3-4ae9-4c90-b431-75cdd2198a06-kube-api-access-gpbzg\") pod \"redhat-marketplace-bss87\" (UID: \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\") " pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:10 crc kubenswrapper[4749]: I0310 17:08:10.196473 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee003fa3-4ae9-4c90-b431-75cdd2198a06-catalog-content\") pod \"redhat-marketplace-bss87\" (UID: \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\") " pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:10 crc kubenswrapper[4749]: I0310 17:08:10.196505 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee003fa3-4ae9-4c90-b431-75cdd2198a06-utilities\") pod \"redhat-marketplace-bss87\" (UID: \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\") " pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:10 crc kubenswrapper[4749]: I0310 17:08:10.221573 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpbzg\" (UniqueName: \"kubernetes.io/projected/ee003fa3-4ae9-4c90-b431-75cdd2198a06-kube-api-access-gpbzg\") pod \"redhat-marketplace-bss87\" (UID: \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\") " pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:10 crc kubenswrapper[4749]: I0310 17:08:10.298415 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:10 crc kubenswrapper[4749]: I0310 17:08:10.777329 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bss87"]
Mar 10 17:08:11 crc kubenswrapper[4749]: I0310 17:08:11.607279 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"
Mar 10 17:08:11 crc kubenswrapper[4749]: E0310 17:08:11.608017 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:08:11 crc kubenswrapper[4749]: I0310 17:08:11.871323 4749 generic.go:334] "Generic (PLEG): container finished" podID="ee003fa3-4ae9-4c90-b431-75cdd2198a06" containerID="faf45083bb6ac8dd9ae4ea4c1a89648a710ea178cf7a14bcfe9a429705cd40f5" exitCode=0
Mar 10 17:08:11 crc kubenswrapper[4749]: I0310 17:08:11.871424 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bss87" event={"ID":"ee003fa3-4ae9-4c90-b431-75cdd2198a06","Type":"ContainerDied","Data":"faf45083bb6ac8dd9ae4ea4c1a89648a710ea178cf7a14bcfe9a429705cd40f5"}
Mar 10 17:08:11 crc kubenswrapper[4749]: I0310 17:08:11.871462 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bss87" event={"ID":"ee003fa3-4ae9-4c90-b431-75cdd2198a06","Type":"ContainerStarted","Data":"8304ba97b148b394ceb1b9bf575607434477d68c3187e13dc1732c2b00d04b2d"}
Mar 10 17:08:26 crc kubenswrapper[4749]: I0310 17:08:26.607080 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"
Mar 10 17:08:26 crc kubenswrapper[4749]: E0310 17:08:26.608210 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:08:26 crc kubenswrapper[4749]: I0310 17:08:26.718540 4749 scope.go:117] "RemoveContainer" containerID="4d6757f1bee9f6371b1943f452d32501524f8ed65310c7d683a3e7d1b8b8df50"
Mar 10 17:08:40 crc kubenswrapper[4749]: I0310 17:08:40.607739 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"
Mar 10 17:08:40 crc kubenswrapper[4749]: E0310 17:08:40.608470 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:08:44 crc kubenswrapper[4749]: I0310 17:08:44.176282 4749 generic.go:334] "Generic (PLEG): container finished" podID="ee003fa3-4ae9-4c90-b431-75cdd2198a06" containerID="0c0a3bed61c01d6121da66461371fcfabba306077b5235aae61bbfdc613a72d9" exitCode=0
Mar 10 17:08:44 crc kubenswrapper[4749]: I0310 17:08:44.176398 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bss87" event={"ID":"ee003fa3-4ae9-4c90-b431-75cdd2198a06","Type":"ContainerDied","Data":"0c0a3bed61c01d6121da66461371fcfabba306077b5235aae61bbfdc613a72d9"}
Mar 10 17:08:45 crc kubenswrapper[4749]: I0310 17:08:45.188315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bss87" event={"ID":"ee003fa3-4ae9-4c90-b431-75cdd2198a06","Type":"ContainerStarted","Data":"760689dc0d27d3448a7b9503421f31857c9e0432beb69f45c014c7c18ea2a324"}
Mar 10 17:08:45 crc kubenswrapper[4749]: I0310 17:08:45.219047 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bss87" podStartSLOduration=3.539260553 podStartE2EDuration="36.219025447s" podCreationTimestamp="2026-03-10 17:08:09 +0000 UTC" firstStartedPulling="2026-03-10 17:08:11.874197708 +0000 UTC m=+4788.996063395" lastFinishedPulling="2026-03-10 17:08:44.553962602 +0000 UTC m=+4821.675828289" observedRunningTime="2026-03-10 17:08:45.21328682 +0000 UTC m=+4822.335152517" watchObservedRunningTime="2026-03-10 17:08:45.219025447 +0000 UTC m=+4822.340891144"
Mar 10 17:08:50 crc kubenswrapper[4749]: I0310 17:08:50.298932 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:50 crc kubenswrapper[4749]: I0310 17:08:50.299520 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:50 crc kubenswrapper[4749]: I0310 17:08:50.345724 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:51 crc kubenswrapper[4749]: I0310 17:08:51.330205 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:51 crc kubenswrapper[4749]: I0310 17:08:51.398440 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bss87"]
Mar 10 17:08:52 crc kubenswrapper[4749]: I0310 17:08:52.607864 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c"
Mar 10 17:08:52 crc kubenswrapper[4749]: E0310 17:08:52.608225 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:08:53 crc kubenswrapper[4749]: I0310 17:08:53.249300 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bss87" podUID="ee003fa3-4ae9-4c90-b431-75cdd2198a06" containerName="registry-server" containerID="cri-o://760689dc0d27d3448a7b9503421f31857c9e0432beb69f45c014c7c18ea2a324" gracePeriod=2
Mar 10 17:08:53 crc kubenswrapper[4749]: I0310 17:08:53.650342 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bss87"
Mar 10 17:08:53 crc kubenswrapper[4749]: I0310 17:08:53.852325 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee003fa3-4ae9-4c90-b431-75cdd2198a06-utilities\") pod \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\" (UID: \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\") "
Mar 10 17:08:53 crc kubenswrapper[4749]: I0310 17:08:53.854805 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee003fa3-4ae9-4c90-b431-75cdd2198a06-catalog-content\") pod \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\" (UID: \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\") "
Mar 10 17:08:53 crc kubenswrapper[4749]: I0310 17:08:53.854697 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee003fa3-4ae9-4c90-b431-75cdd2198a06-utilities" (OuterVolumeSpecName:
"utilities") pod "ee003fa3-4ae9-4c90-b431-75cdd2198a06" (UID: "ee003fa3-4ae9-4c90-b431-75cdd2198a06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:08:53 crc kubenswrapper[4749]: I0310 17:08:53.855347 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpbzg\" (UniqueName: \"kubernetes.io/projected/ee003fa3-4ae9-4c90-b431-75cdd2198a06-kube-api-access-gpbzg\") pod \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\" (UID: \"ee003fa3-4ae9-4c90-b431-75cdd2198a06\") " Mar 10 17:08:53 crc kubenswrapper[4749]: I0310 17:08:53.856255 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee003fa3-4ae9-4c90-b431-75cdd2198a06-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 17:08:53 crc kubenswrapper[4749]: I0310 17:08:53.865003 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee003fa3-4ae9-4c90-b431-75cdd2198a06-kube-api-access-gpbzg" (OuterVolumeSpecName: "kube-api-access-gpbzg") pod "ee003fa3-4ae9-4c90-b431-75cdd2198a06" (UID: "ee003fa3-4ae9-4c90-b431-75cdd2198a06"). InnerVolumeSpecName "kube-api-access-gpbzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:08:53 crc kubenswrapper[4749]: I0310 17:08:53.902754 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee003fa3-4ae9-4c90-b431-75cdd2198a06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee003fa3-4ae9-4c90-b431-75cdd2198a06" (UID: "ee003fa3-4ae9-4c90-b431-75cdd2198a06"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:08:53 crc kubenswrapper[4749]: I0310 17:08:53.958299 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee003fa3-4ae9-4c90-b431-75cdd2198a06-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 17:08:53 crc kubenswrapper[4749]: I0310 17:08:53.958351 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpbzg\" (UniqueName: \"kubernetes.io/projected/ee003fa3-4ae9-4c90-b431-75cdd2198a06-kube-api-access-gpbzg\") on node \"crc\" DevicePath \"\"" Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.262262 4749 generic.go:334] "Generic (PLEG): container finished" podID="ee003fa3-4ae9-4c90-b431-75cdd2198a06" containerID="760689dc0d27d3448a7b9503421f31857c9e0432beb69f45c014c7c18ea2a324" exitCode=0 Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.262426 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bss87" event={"ID":"ee003fa3-4ae9-4c90-b431-75cdd2198a06","Type":"ContainerDied","Data":"760689dc0d27d3448a7b9503421f31857c9e0432beb69f45c014c7c18ea2a324"} Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.262868 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bss87" event={"ID":"ee003fa3-4ae9-4c90-b431-75cdd2198a06","Type":"ContainerDied","Data":"8304ba97b148b394ceb1b9bf575607434477d68c3187e13dc1732c2b00d04b2d"} Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.262456 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bss87" Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.262909 4749 scope.go:117] "RemoveContainer" containerID="760689dc0d27d3448a7b9503421f31857c9e0432beb69f45c014c7c18ea2a324" Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.305433 4749 scope.go:117] "RemoveContainer" containerID="0c0a3bed61c01d6121da66461371fcfabba306077b5235aae61bbfdc613a72d9" Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.315273 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bss87"] Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.339443 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bss87"] Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.340638 4749 scope.go:117] "RemoveContainer" containerID="faf45083bb6ac8dd9ae4ea4c1a89648a710ea178cf7a14bcfe9a429705cd40f5" Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.375854 4749 scope.go:117] "RemoveContainer" containerID="760689dc0d27d3448a7b9503421f31857c9e0432beb69f45c014c7c18ea2a324" Mar 10 17:08:54 crc kubenswrapper[4749]: E0310 17:08:54.376798 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"760689dc0d27d3448a7b9503421f31857c9e0432beb69f45c014c7c18ea2a324\": container with ID starting with 760689dc0d27d3448a7b9503421f31857c9e0432beb69f45c014c7c18ea2a324 not found: ID does not exist" containerID="760689dc0d27d3448a7b9503421f31857c9e0432beb69f45c014c7c18ea2a324" Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.376846 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"760689dc0d27d3448a7b9503421f31857c9e0432beb69f45c014c7c18ea2a324"} err="failed to get container status \"760689dc0d27d3448a7b9503421f31857c9e0432beb69f45c014c7c18ea2a324\": rpc error: code = NotFound desc = could not find container 
\"760689dc0d27d3448a7b9503421f31857c9e0432beb69f45c014c7c18ea2a324\": container with ID starting with 760689dc0d27d3448a7b9503421f31857c9e0432beb69f45c014c7c18ea2a324 not found: ID does not exist" Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.376879 4749 scope.go:117] "RemoveContainer" containerID="0c0a3bed61c01d6121da66461371fcfabba306077b5235aae61bbfdc613a72d9" Mar 10 17:08:54 crc kubenswrapper[4749]: E0310 17:08:54.377099 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c0a3bed61c01d6121da66461371fcfabba306077b5235aae61bbfdc613a72d9\": container with ID starting with 0c0a3bed61c01d6121da66461371fcfabba306077b5235aae61bbfdc613a72d9 not found: ID does not exist" containerID="0c0a3bed61c01d6121da66461371fcfabba306077b5235aae61bbfdc613a72d9" Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.377135 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c0a3bed61c01d6121da66461371fcfabba306077b5235aae61bbfdc613a72d9"} err="failed to get container status \"0c0a3bed61c01d6121da66461371fcfabba306077b5235aae61bbfdc613a72d9\": rpc error: code = NotFound desc = could not find container \"0c0a3bed61c01d6121da66461371fcfabba306077b5235aae61bbfdc613a72d9\": container with ID starting with 0c0a3bed61c01d6121da66461371fcfabba306077b5235aae61bbfdc613a72d9 not found: ID does not exist" Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.377159 4749 scope.go:117] "RemoveContainer" containerID="faf45083bb6ac8dd9ae4ea4c1a89648a710ea178cf7a14bcfe9a429705cd40f5" Mar 10 17:08:54 crc kubenswrapper[4749]: E0310 17:08:54.377526 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faf45083bb6ac8dd9ae4ea4c1a89648a710ea178cf7a14bcfe9a429705cd40f5\": container with ID starting with faf45083bb6ac8dd9ae4ea4c1a89648a710ea178cf7a14bcfe9a429705cd40f5 not found: ID does not exist" 
containerID="faf45083bb6ac8dd9ae4ea4c1a89648a710ea178cf7a14bcfe9a429705cd40f5" Mar 10 17:08:54 crc kubenswrapper[4749]: I0310 17:08:54.377567 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf45083bb6ac8dd9ae4ea4c1a89648a710ea178cf7a14bcfe9a429705cd40f5"} err="failed to get container status \"faf45083bb6ac8dd9ae4ea4c1a89648a710ea178cf7a14bcfe9a429705cd40f5\": rpc error: code = NotFound desc = could not find container \"faf45083bb6ac8dd9ae4ea4c1a89648a710ea178cf7a14bcfe9a429705cd40f5\": container with ID starting with faf45083bb6ac8dd9ae4ea4c1a89648a710ea178cf7a14bcfe9a429705cd40f5 not found: ID does not exist" Mar 10 17:08:55 crc kubenswrapper[4749]: I0310 17:08:55.614575 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee003fa3-4ae9-4c90-b431-75cdd2198a06" path="/var/lib/kubelet/pods/ee003fa3-4ae9-4c90-b431-75cdd2198a06/volumes" Mar 10 17:09:05 crc kubenswrapper[4749]: I0310 17:09:05.612915 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" Mar 10 17:09:05 crc kubenswrapper[4749]: E0310 17:09:05.614627 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:09:16 crc kubenswrapper[4749]: I0310 17:09:16.606317 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" Mar 10 17:09:16 crc kubenswrapper[4749]: E0310 17:09:16.607250 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:09:31 crc kubenswrapper[4749]: I0310 17:09:31.609704 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" Mar 10 17:09:31 crc kubenswrapper[4749]: E0310 17:09:31.610969 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.505164 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-ls45l"] Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.515904 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-ls45l"] Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.626567 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-rpml6"] Mar 10 17:09:32 crc kubenswrapper[4749]: E0310 17:09:32.626845 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee003fa3-4ae9-4c90-b431-75cdd2198a06" containerName="extract-utilities" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.626857 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee003fa3-4ae9-4c90-b431-75cdd2198a06" containerName="extract-utilities" Mar 10 17:09:32 crc kubenswrapper[4749]: E0310 17:09:32.626877 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee003fa3-4ae9-4c90-b431-75cdd2198a06" 
containerName="extract-content" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.626883 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee003fa3-4ae9-4c90-b431-75cdd2198a06" containerName="extract-content" Mar 10 17:09:32 crc kubenswrapper[4749]: E0310 17:09:32.626896 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee003fa3-4ae9-4c90-b431-75cdd2198a06" containerName="registry-server" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.626901 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee003fa3-4ae9-4c90-b431-75cdd2198a06" containerName="registry-server" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.627034 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee003fa3-4ae9-4c90-b431-75cdd2198a06" containerName="registry-server" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.627463 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-rpml6" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.630177 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.631173 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.631369 4749 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-jnthk" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.632260 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.651294 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-rpml6"] Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.732564 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8rwhk\" (UniqueName: \"kubernetes.io/projected/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-kube-api-access-8rwhk\") pod \"crc-storage-crc-rpml6\" (UID: \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\") " pod="crc-storage/crc-storage-crc-rpml6" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.732664 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-crc-storage\") pod \"crc-storage-crc-rpml6\" (UID: \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\") " pod="crc-storage/crc-storage-crc-rpml6" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.732947 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-node-mnt\") pod \"crc-storage-crc-rpml6\" (UID: \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\") " pod="crc-storage/crc-storage-crc-rpml6" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.834165 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-node-mnt\") pod \"crc-storage-crc-rpml6\" (UID: \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\") " pod="crc-storage/crc-storage-crc-rpml6" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.834245 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rwhk\" (UniqueName: \"kubernetes.io/projected/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-kube-api-access-8rwhk\") pod \"crc-storage-crc-rpml6\" (UID: \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\") " pod="crc-storage/crc-storage-crc-rpml6" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.834276 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-crc-storage\") pod \"crc-storage-crc-rpml6\" (UID: \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\") " pod="crc-storage/crc-storage-crc-rpml6" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.834589 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-node-mnt\") pod \"crc-storage-crc-rpml6\" (UID: \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\") " pod="crc-storage/crc-storage-crc-rpml6" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.835680 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-crc-storage\") pod \"crc-storage-crc-rpml6\" (UID: \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\") " pod="crc-storage/crc-storage-crc-rpml6" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.856205 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rwhk\" (UniqueName: \"kubernetes.io/projected/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-kube-api-access-8rwhk\") pod \"crc-storage-crc-rpml6\" (UID: \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\") " pod="crc-storage/crc-storage-crc-rpml6" Mar 10 17:09:32 crc kubenswrapper[4749]: I0310 17:09:32.955834 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-rpml6" Mar 10 17:09:33 crc kubenswrapper[4749]: I0310 17:09:33.236600 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-rpml6"] Mar 10 17:09:33 crc kubenswrapper[4749]: I0310 17:09:33.589490 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rpml6" event={"ID":"6b9f64c9-e3b2-4fb5-a927-0a06e145715a","Type":"ContainerStarted","Data":"163baeb239c1706d3e35455241d98ac7f1b6f36b9921da2da1a8ca1ee4e6195a"} Mar 10 17:09:33 crc kubenswrapper[4749]: I0310 17:09:33.618996 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf3818d3-7881-44ef-afd9-cb50f8a4bf4c" path="/var/lib/kubelet/pods/cf3818d3-7881-44ef-afd9-cb50f8a4bf4c/volumes" Mar 10 17:09:34 crc kubenswrapper[4749]: I0310 17:09:34.598206 4749 generic.go:334] "Generic (PLEG): container finished" podID="6b9f64c9-e3b2-4fb5-a927-0a06e145715a" containerID="ea0e01fd6cafbf133ff89f605773f476559a340189d659f4d017c62899b2278c" exitCode=0 Mar 10 17:09:34 crc kubenswrapper[4749]: I0310 17:09:34.598294 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rpml6" event={"ID":"6b9f64c9-e3b2-4fb5-a927-0a06e145715a","Type":"ContainerDied","Data":"ea0e01fd6cafbf133ff89f605773f476559a340189d659f4d017c62899b2278c"} Mar 10 17:09:35 crc kubenswrapper[4749]: I0310 17:09:35.926567 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-rpml6" Mar 10 17:09:36 crc kubenswrapper[4749]: I0310 17:09:36.078465 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-crc-storage\") pod \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\" (UID: \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\") " Mar 10 17:09:36 crc kubenswrapper[4749]: I0310 17:09:36.078544 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rwhk\" (UniqueName: \"kubernetes.io/projected/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-kube-api-access-8rwhk\") pod \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\" (UID: \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\") " Mar 10 17:09:36 crc kubenswrapper[4749]: I0310 17:09:36.078576 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-node-mnt\") pod \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\" (UID: \"6b9f64c9-e3b2-4fb5-a927-0a06e145715a\") " Mar 10 17:09:36 crc kubenswrapper[4749]: I0310 17:09:36.078910 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6b9f64c9-e3b2-4fb5-a927-0a06e145715a" (UID: "6b9f64c9-e3b2-4fb5-a927-0a06e145715a"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 17:09:36 crc kubenswrapper[4749]: I0310 17:09:36.085058 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-kube-api-access-8rwhk" (OuterVolumeSpecName: "kube-api-access-8rwhk") pod "6b9f64c9-e3b2-4fb5-a927-0a06e145715a" (UID: "6b9f64c9-e3b2-4fb5-a927-0a06e145715a"). InnerVolumeSpecName "kube-api-access-8rwhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:09:36 crc kubenswrapper[4749]: I0310 17:09:36.112184 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6b9f64c9-e3b2-4fb5-a927-0a06e145715a" (UID: "6b9f64c9-e3b2-4fb5-a927-0a06e145715a"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:09:36 crc kubenswrapper[4749]: I0310 17:09:36.180234 4749 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 10 17:09:36 crc kubenswrapper[4749]: I0310 17:09:36.180574 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rwhk\" (UniqueName: \"kubernetes.io/projected/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-kube-api-access-8rwhk\") on node \"crc\" DevicePath \"\"" Mar 10 17:09:36 crc kubenswrapper[4749]: I0310 17:09:36.180723 4749 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b9f64c9-e3b2-4fb5-a927-0a06e145715a-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 10 17:09:36 crc kubenswrapper[4749]: I0310 17:09:36.614035 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-rpml6" event={"ID":"6b9f64c9-e3b2-4fb5-a927-0a06e145715a","Type":"ContainerDied","Data":"163baeb239c1706d3e35455241d98ac7f1b6f36b9921da2da1a8ca1ee4e6195a"} Mar 10 17:09:36 crc kubenswrapper[4749]: I0310 17:09:36.614076 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-rpml6" Mar 10 17:09:36 crc kubenswrapper[4749]: I0310 17:09:36.614104 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="163baeb239c1706d3e35455241d98ac7f1b6f36b9921da2da1a8ca1ee4e6195a" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.131813 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-rpml6"] Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.140666 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-rpml6"] Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.313652 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-6k8fr"] Mar 10 17:09:38 crc kubenswrapper[4749]: E0310 17:09:38.314108 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9f64c9-e3b2-4fb5-a927-0a06e145715a" containerName="storage" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.314136 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9f64c9-e3b2-4fb5-a927-0a06e145715a" containerName="storage" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.315747 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9f64c9-e3b2-4fb5-a927-0a06e145715a" containerName="storage" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.316584 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6k8fr" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.320185 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.320327 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.324342 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.325093 4749 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-jnthk" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.327268 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6k8fr"] Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.413517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/14d48496-5a99-4e1e-81e8-bf2360f0889c-crc-storage\") pod \"crc-storage-crc-6k8fr\" (UID: \"14d48496-5a99-4e1e-81e8-bf2360f0889c\") " pod="crc-storage/crc-storage-crc-6k8fr" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.413616 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsnwg\" (UniqueName: \"kubernetes.io/projected/14d48496-5a99-4e1e-81e8-bf2360f0889c-kube-api-access-lsnwg\") pod \"crc-storage-crc-6k8fr\" (UID: \"14d48496-5a99-4e1e-81e8-bf2360f0889c\") " pod="crc-storage/crc-storage-crc-6k8fr" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.413658 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/14d48496-5a99-4e1e-81e8-bf2360f0889c-node-mnt\") pod \"crc-storage-crc-6k8fr\" (UID: 
\"14d48496-5a99-4e1e-81e8-bf2360f0889c\") " pod="crc-storage/crc-storage-crc-6k8fr" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.515136 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/14d48496-5a99-4e1e-81e8-bf2360f0889c-crc-storage\") pod \"crc-storage-crc-6k8fr\" (UID: \"14d48496-5a99-4e1e-81e8-bf2360f0889c\") " pod="crc-storage/crc-storage-crc-6k8fr" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.515214 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsnwg\" (UniqueName: \"kubernetes.io/projected/14d48496-5a99-4e1e-81e8-bf2360f0889c-kube-api-access-lsnwg\") pod \"crc-storage-crc-6k8fr\" (UID: \"14d48496-5a99-4e1e-81e8-bf2360f0889c\") " pod="crc-storage/crc-storage-crc-6k8fr" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.515242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/14d48496-5a99-4e1e-81e8-bf2360f0889c-node-mnt\") pod \"crc-storage-crc-6k8fr\" (UID: \"14d48496-5a99-4e1e-81e8-bf2360f0889c\") " pod="crc-storage/crc-storage-crc-6k8fr" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.515538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/14d48496-5a99-4e1e-81e8-bf2360f0889c-node-mnt\") pod \"crc-storage-crc-6k8fr\" (UID: \"14d48496-5a99-4e1e-81e8-bf2360f0889c\") " pod="crc-storage/crc-storage-crc-6k8fr" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.516784 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/14d48496-5a99-4e1e-81e8-bf2360f0889c-crc-storage\") pod \"crc-storage-crc-6k8fr\" (UID: \"14d48496-5a99-4e1e-81e8-bf2360f0889c\") " pod="crc-storage/crc-storage-crc-6k8fr" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.674657 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsnwg\" (UniqueName: \"kubernetes.io/projected/14d48496-5a99-4e1e-81e8-bf2360f0889c-kube-api-access-lsnwg\") pod \"crc-storage-crc-6k8fr\" (UID: \"14d48496-5a99-4e1e-81e8-bf2360f0889c\") " pod="crc-storage/crc-storage-crc-6k8fr" Mar 10 17:09:38 crc kubenswrapper[4749]: I0310 17:09:38.935282 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6k8fr" Mar 10 17:09:39 crc kubenswrapper[4749]: I0310 17:09:39.409452 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6k8fr"] Mar 10 17:09:39 crc kubenswrapper[4749]: I0310 17:09:39.614429 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9f64c9-e3b2-4fb5-a927-0a06e145715a" path="/var/lib/kubelet/pods/6b9f64c9-e3b2-4fb5-a927-0a06e145715a/volumes" Mar 10 17:09:39 crc kubenswrapper[4749]: I0310 17:09:39.634655 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6k8fr" event={"ID":"14d48496-5a99-4e1e-81e8-bf2360f0889c","Type":"ContainerStarted","Data":"710587f654605f23fb1c78d216fc3cc762aa1896f21c1b6f69f5586b7cccb2f6"} Mar 10 17:09:40 crc kubenswrapper[4749]: I0310 17:09:40.645070 4749 generic.go:334] "Generic (PLEG): container finished" podID="14d48496-5a99-4e1e-81e8-bf2360f0889c" containerID="2ca632b3c2263971cb961f2f8854f7d146ad777938950597ad89f680d97d18ea" exitCode=0 Mar 10 17:09:40 crc kubenswrapper[4749]: I0310 17:09:40.645126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6k8fr" event={"ID":"14d48496-5a99-4e1e-81e8-bf2360f0889c","Type":"ContainerDied","Data":"2ca632b3c2263971cb961f2f8854f7d146ad777938950597ad89f680d97d18ea"} Mar 10 17:09:41 crc kubenswrapper[4749]: I0310 17:09:41.971548 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6k8fr" Mar 10 17:09:42 crc kubenswrapper[4749]: I0310 17:09:42.068645 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/14d48496-5a99-4e1e-81e8-bf2360f0889c-node-mnt\") pod \"14d48496-5a99-4e1e-81e8-bf2360f0889c\" (UID: \"14d48496-5a99-4e1e-81e8-bf2360f0889c\") " Mar 10 17:09:42 crc kubenswrapper[4749]: I0310 17:09:42.068722 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsnwg\" (UniqueName: \"kubernetes.io/projected/14d48496-5a99-4e1e-81e8-bf2360f0889c-kube-api-access-lsnwg\") pod \"14d48496-5a99-4e1e-81e8-bf2360f0889c\" (UID: \"14d48496-5a99-4e1e-81e8-bf2360f0889c\") " Mar 10 17:09:42 crc kubenswrapper[4749]: I0310 17:09:42.068760 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/14d48496-5a99-4e1e-81e8-bf2360f0889c-crc-storage\") pod \"14d48496-5a99-4e1e-81e8-bf2360f0889c\" (UID: \"14d48496-5a99-4e1e-81e8-bf2360f0889c\") " Mar 10 17:09:42 crc kubenswrapper[4749]: I0310 17:09:42.068920 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14d48496-5a99-4e1e-81e8-bf2360f0889c-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "14d48496-5a99-4e1e-81e8-bf2360f0889c" (UID: "14d48496-5a99-4e1e-81e8-bf2360f0889c"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 17:09:42 crc kubenswrapper[4749]: I0310 17:09:42.078895 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14d48496-5a99-4e1e-81e8-bf2360f0889c-kube-api-access-lsnwg" (OuterVolumeSpecName: "kube-api-access-lsnwg") pod "14d48496-5a99-4e1e-81e8-bf2360f0889c" (UID: "14d48496-5a99-4e1e-81e8-bf2360f0889c"). InnerVolumeSpecName "kube-api-access-lsnwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:09:42 crc kubenswrapper[4749]: I0310 17:09:42.095416 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14d48496-5a99-4e1e-81e8-bf2360f0889c-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "14d48496-5a99-4e1e-81e8-bf2360f0889c" (UID: "14d48496-5a99-4e1e-81e8-bf2360f0889c"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:09:42 crc kubenswrapper[4749]: I0310 17:09:42.176641 4749 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/14d48496-5a99-4e1e-81e8-bf2360f0889c-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 10 17:09:42 crc kubenswrapper[4749]: I0310 17:09:42.176683 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsnwg\" (UniqueName: \"kubernetes.io/projected/14d48496-5a99-4e1e-81e8-bf2360f0889c-kube-api-access-lsnwg\") on node \"crc\" DevicePath \"\"" Mar 10 17:09:42 crc kubenswrapper[4749]: I0310 17:09:42.176698 4749 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/14d48496-5a99-4e1e-81e8-bf2360f0889c-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 10 17:09:42 crc kubenswrapper[4749]: I0310 17:09:42.662341 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6k8fr" event={"ID":"14d48496-5a99-4e1e-81e8-bf2360f0889c","Type":"ContainerDied","Data":"710587f654605f23fb1c78d216fc3cc762aa1896f21c1b6f69f5586b7cccb2f6"} Mar 10 17:09:42 crc kubenswrapper[4749]: I0310 17:09:42.662489 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="710587f654605f23fb1c78d216fc3cc762aa1896f21c1b6f69f5586b7cccb2f6" Mar 10 17:09:42 crc kubenswrapper[4749]: I0310 17:09:42.662414 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6k8fr" Mar 10 17:09:43 crc kubenswrapper[4749]: I0310 17:09:43.610985 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" Mar 10 17:09:43 crc kubenswrapper[4749]: E0310 17:09:43.611625 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:09:57 crc kubenswrapper[4749]: I0310 17:09:57.608239 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" Mar 10 17:09:57 crc kubenswrapper[4749]: E0310 17:09:57.610589 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:10:00 crc kubenswrapper[4749]: I0310 17:10:00.175667 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552710-vnmvn"] Mar 10 17:10:00 crc kubenswrapper[4749]: E0310 17:10:00.178059 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d48496-5a99-4e1e-81e8-bf2360f0889c" containerName="storage" Mar 10 17:10:00 crc kubenswrapper[4749]: I0310 17:10:00.178232 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d48496-5a99-4e1e-81e8-bf2360f0889c" containerName="storage" Mar 10 17:10:00 crc 
kubenswrapper[4749]: I0310 17:10:00.178669 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d48496-5a99-4e1e-81e8-bf2360f0889c" containerName="storage" Mar 10 17:10:00 crc kubenswrapper[4749]: I0310 17:10:00.179847 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552710-vnmvn" Mar 10 17:10:00 crc kubenswrapper[4749]: I0310 17:10:00.183685 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:10:00 crc kubenswrapper[4749]: I0310 17:10:00.184060 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:10:00 crc kubenswrapper[4749]: I0310 17:10:00.184497 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:10:00 crc kubenswrapper[4749]: I0310 17:10:00.193015 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552710-vnmvn"] Mar 10 17:10:00 crc kubenswrapper[4749]: I0310 17:10:00.361140 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr62n\" (UniqueName: \"kubernetes.io/projected/2f4f3dcc-791c-4146-abdc-24b11d5a85fa-kube-api-access-vr62n\") pod \"auto-csr-approver-29552710-vnmvn\" (UID: \"2f4f3dcc-791c-4146-abdc-24b11d5a85fa\") " pod="openshift-infra/auto-csr-approver-29552710-vnmvn" Mar 10 17:10:00 crc kubenswrapper[4749]: I0310 17:10:00.462820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr62n\" (UniqueName: \"kubernetes.io/projected/2f4f3dcc-791c-4146-abdc-24b11d5a85fa-kube-api-access-vr62n\") pod \"auto-csr-approver-29552710-vnmvn\" (UID: \"2f4f3dcc-791c-4146-abdc-24b11d5a85fa\") " pod="openshift-infra/auto-csr-approver-29552710-vnmvn" Mar 10 17:10:00 crc kubenswrapper[4749]: I0310 17:10:00.499563 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr62n\" (UniqueName: \"kubernetes.io/projected/2f4f3dcc-791c-4146-abdc-24b11d5a85fa-kube-api-access-vr62n\") pod \"auto-csr-approver-29552710-vnmvn\" (UID: \"2f4f3dcc-791c-4146-abdc-24b11d5a85fa\") " pod="openshift-infra/auto-csr-approver-29552710-vnmvn" Mar 10 17:10:00 crc kubenswrapper[4749]: I0310 17:10:00.517364 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552710-vnmvn" Mar 10 17:10:00 crc kubenswrapper[4749]: I0310 17:10:00.963013 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552710-vnmvn"] Mar 10 17:10:01 crc kubenswrapper[4749]: I0310 17:10:01.828143 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552710-vnmvn" event={"ID":"2f4f3dcc-791c-4146-abdc-24b11d5a85fa","Type":"ContainerStarted","Data":"20a4b6b69b526470bc94abcd31b8d1cfa06b52e364b8519a7b066fbd89a73a05"} Mar 10 17:10:03 crc kubenswrapper[4749]: I0310 17:10:03.849378 4749 generic.go:334] "Generic (PLEG): container finished" podID="2f4f3dcc-791c-4146-abdc-24b11d5a85fa" containerID="eceb9ebe21bdc849273f8d8896390da1d4cfedeca81205eafe0717718e8d3ffe" exitCode=0 Mar 10 17:10:03 crc kubenswrapper[4749]: I0310 17:10:03.849437 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552710-vnmvn" event={"ID":"2f4f3dcc-791c-4146-abdc-24b11d5a85fa","Type":"ContainerDied","Data":"eceb9ebe21bdc849273f8d8896390da1d4cfedeca81205eafe0717718e8d3ffe"} Mar 10 17:10:05 crc kubenswrapper[4749]: I0310 17:10:05.141946 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552710-vnmvn" Mar 10 17:10:05 crc kubenswrapper[4749]: I0310 17:10:05.237005 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr62n\" (UniqueName: \"kubernetes.io/projected/2f4f3dcc-791c-4146-abdc-24b11d5a85fa-kube-api-access-vr62n\") pod \"2f4f3dcc-791c-4146-abdc-24b11d5a85fa\" (UID: \"2f4f3dcc-791c-4146-abdc-24b11d5a85fa\") " Mar 10 17:10:05 crc kubenswrapper[4749]: I0310 17:10:05.243231 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f4f3dcc-791c-4146-abdc-24b11d5a85fa-kube-api-access-vr62n" (OuterVolumeSpecName: "kube-api-access-vr62n") pod "2f4f3dcc-791c-4146-abdc-24b11d5a85fa" (UID: "2f4f3dcc-791c-4146-abdc-24b11d5a85fa"). InnerVolumeSpecName "kube-api-access-vr62n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:10:05 crc kubenswrapper[4749]: I0310 17:10:05.338620 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr62n\" (UniqueName: \"kubernetes.io/projected/2f4f3dcc-791c-4146-abdc-24b11d5a85fa-kube-api-access-vr62n\") on node \"crc\" DevicePath \"\"" Mar 10 17:10:05 crc kubenswrapper[4749]: I0310 17:10:05.867086 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552710-vnmvn" event={"ID":"2f4f3dcc-791c-4146-abdc-24b11d5a85fa","Type":"ContainerDied","Data":"20a4b6b69b526470bc94abcd31b8d1cfa06b52e364b8519a7b066fbd89a73a05"} Mar 10 17:10:05 crc kubenswrapper[4749]: I0310 17:10:05.867144 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20a4b6b69b526470bc94abcd31b8d1cfa06b52e364b8519a7b066fbd89a73a05" Mar 10 17:10:05 crc kubenswrapper[4749]: I0310 17:10:05.867233 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552710-vnmvn" Mar 10 17:10:06 crc kubenswrapper[4749]: I0310 17:10:06.224077 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552704-rtrmd"] Mar 10 17:10:06 crc kubenswrapper[4749]: I0310 17:10:06.229949 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552704-rtrmd"] Mar 10 17:10:07 crc kubenswrapper[4749]: I0310 17:10:07.621671 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e869a60-e444-417b-9256-e63fe1b7cfa0" path="/var/lib/kubelet/pods/4e869a60-e444-417b-9256-e63fe1b7cfa0/volumes" Mar 10 17:10:08 crc kubenswrapper[4749]: I0310 17:10:08.608323 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" Mar 10 17:10:08 crc kubenswrapper[4749]: E0310 17:10:08.608959 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:10:22 crc kubenswrapper[4749]: I0310 17:10:22.610364 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" Mar 10 17:10:22 crc kubenswrapper[4749]: E0310 17:10:22.611620 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:10:26 crc kubenswrapper[4749]: I0310 17:10:26.827437 4749 scope.go:117] "RemoveContainer" containerID="5081b4a2d9a3aa46c6b557f9c7d0279c98311e5391bc8c6b27c7d9ef0ea42e7c" Mar 10 17:10:26 crc kubenswrapper[4749]: I0310 17:10:26.848317 4749 scope.go:117] "RemoveContainer" containerID="3cc7a88cdab6ba4db24b312ed2ad280070807e43ddf3934e2da2d72697432650" Mar 10 17:10:34 crc kubenswrapper[4749]: I0310 17:10:34.606649 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" Mar 10 17:10:34 crc kubenswrapper[4749]: E0310 17:10:34.607849 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:10:48 crc kubenswrapper[4749]: I0310 17:10:48.607137 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" Mar 10 17:10:48 crc kubenswrapper[4749]: E0310 17:10:48.608166 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:10:59 crc kubenswrapper[4749]: I0310 17:10:59.607204 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" Mar 10 17:11:00 crc kubenswrapper[4749]: I0310 17:11:00.845614 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"3dc6824f04ac0b657b53434f59fc2f1cf018f8ac81376732d74b1a55125ca1c7"} Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.161488 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c44667757-46znw"] Mar 10 17:11:39 crc kubenswrapper[4749]: E0310 17:11:39.162511 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4f3dcc-791c-4146-abdc-24b11d5a85fa" containerName="oc" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.162532 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4f3dcc-791c-4146-abdc-24b11d5a85fa" containerName="oc" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.162693 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f4f3dcc-791c-4146-abdc-24b11d5a85fa" containerName="oc" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.163562 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-46znw" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.168042 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jqsdz" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.168286 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.168513 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.168763 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.172486 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl26k\" (UniqueName: \"kubernetes.io/projected/dd456761-80be-4280-ac4a-4a9f80d6b3ed-kube-api-access-nl26k\") pod \"dnsmasq-dns-c44667757-46znw\" (UID: \"dd456761-80be-4280-ac4a-4a9f80d6b3ed\") " pod="openstack/dnsmasq-dns-c44667757-46znw" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.172594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd456761-80be-4280-ac4a-4a9f80d6b3ed-config\") pod \"dnsmasq-dns-c44667757-46znw\" (UID: \"dd456761-80be-4280-ac4a-4a9f80d6b3ed\") " pod="openstack/dnsmasq-dns-c44667757-46znw" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.174898 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-ktwwk"] Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.176102 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.181062 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.184428 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c44667757-46znw"] Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.206621 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-ktwwk"] Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.274628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ead0fba-de6a-4587-a7fa-053cea769d38-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-ktwwk\" (UID: \"7ead0fba-de6a-4587-a7fa-053cea769d38\") " pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.274690 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl26k\" (UniqueName: \"kubernetes.io/projected/dd456761-80be-4280-ac4a-4a9f80d6b3ed-kube-api-access-nl26k\") pod \"dnsmasq-dns-c44667757-46znw\" (UID: \"dd456761-80be-4280-ac4a-4a9f80d6b3ed\") " pod="openstack/dnsmasq-dns-c44667757-46znw" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.274737 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd456761-80be-4280-ac4a-4a9f80d6b3ed-config\") pod \"dnsmasq-dns-c44667757-46znw\" (UID: \"dd456761-80be-4280-ac4a-4a9f80d6b3ed\") " pod="openstack/dnsmasq-dns-c44667757-46znw" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.274766 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmv4c\" (UniqueName: 
\"kubernetes.io/projected/7ead0fba-de6a-4587-a7fa-053cea769d38-kube-api-access-xmv4c\") pod \"dnsmasq-dns-55c76fd6b7-ktwwk\" (UID: \"7ead0fba-de6a-4587-a7fa-053cea769d38\") " pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.274797 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ead0fba-de6a-4587-a7fa-053cea769d38-config\") pod \"dnsmasq-dns-55c76fd6b7-ktwwk\" (UID: \"7ead0fba-de6a-4587-a7fa-053cea769d38\") " pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.276013 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd456761-80be-4280-ac4a-4a9f80d6b3ed-config\") pod \"dnsmasq-dns-c44667757-46znw\" (UID: \"dd456761-80be-4280-ac4a-4a9f80d6b3ed\") " pod="openstack/dnsmasq-dns-c44667757-46znw" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.299157 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl26k\" (UniqueName: \"kubernetes.io/projected/dd456761-80be-4280-ac4a-4a9f80d6b3ed-kube-api-access-nl26k\") pod \"dnsmasq-dns-c44667757-46znw\" (UID: \"dd456761-80be-4280-ac4a-4a9f80d6b3ed\") " pod="openstack/dnsmasq-dns-c44667757-46znw" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.376264 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmv4c\" (UniqueName: \"kubernetes.io/projected/7ead0fba-de6a-4587-a7fa-053cea769d38-kube-api-access-xmv4c\") pod \"dnsmasq-dns-55c76fd6b7-ktwwk\" (UID: \"7ead0fba-de6a-4587-a7fa-053cea769d38\") " pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.376325 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7ead0fba-de6a-4587-a7fa-053cea769d38-config\") pod \"dnsmasq-dns-55c76fd6b7-ktwwk\" (UID: \"7ead0fba-de6a-4587-a7fa-053cea769d38\") " pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.376395 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ead0fba-de6a-4587-a7fa-053cea769d38-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-ktwwk\" (UID: \"7ead0fba-de6a-4587-a7fa-053cea769d38\") " pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.377453 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ead0fba-de6a-4587-a7fa-053cea769d38-config\") pod \"dnsmasq-dns-55c76fd6b7-ktwwk\" (UID: \"7ead0fba-de6a-4587-a7fa-053cea769d38\") " pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.377453 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ead0fba-de6a-4587-a7fa-053cea769d38-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-ktwwk\" (UID: \"7ead0fba-de6a-4587-a7fa-053cea769d38\") " pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.409597 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmv4c\" (UniqueName: \"kubernetes.io/projected/7ead0fba-de6a-4587-a7fa-053cea769d38-kube-api-access-xmv4c\") pod \"dnsmasq-dns-55c76fd6b7-ktwwk\" (UID: \"7ead0fba-de6a-4587-a7fa-053cea769d38\") " pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.412458 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-ktwwk"] Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.415853 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.456629 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76b5b778c5-f4bk8"] Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.458337 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.477666 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13ba1e9-1a24-407e-8cea-a78a4a65429b-dns-svc\") pod \"dnsmasq-dns-76b5b778c5-f4bk8\" (UID: \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\") " pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.477755 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13ba1e9-1a24-407e-8cea-a78a4a65429b-config\") pod \"dnsmasq-dns-76b5b778c5-f4bk8\" (UID: \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\") " pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.477783 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf8nb\" (UniqueName: \"kubernetes.io/projected/b13ba1e9-1a24-407e-8cea-a78a4a65429b-kube-api-access-wf8nb\") pod \"dnsmasq-dns-76b5b778c5-f4bk8\" (UID: \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\") " pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.482222 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5b778c5-f4bk8"] Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.490652 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-46znw" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.580103 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13ba1e9-1a24-407e-8cea-a78a4a65429b-dns-svc\") pod \"dnsmasq-dns-76b5b778c5-f4bk8\" (UID: \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\") " pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.580161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13ba1e9-1a24-407e-8cea-a78a4a65429b-config\") pod \"dnsmasq-dns-76b5b778c5-f4bk8\" (UID: \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\") " pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.580180 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf8nb\" (UniqueName: \"kubernetes.io/projected/b13ba1e9-1a24-407e-8cea-a78a4a65429b-kube-api-access-wf8nb\") pod \"dnsmasq-dns-76b5b778c5-f4bk8\" (UID: \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\") " pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.581248 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13ba1e9-1a24-407e-8cea-a78a4a65429b-dns-svc\") pod \"dnsmasq-dns-76b5b778c5-f4bk8\" (UID: \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\") " pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.582807 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13ba1e9-1a24-407e-8cea-a78a4a65429b-config\") pod \"dnsmasq-dns-76b5b778c5-f4bk8\" (UID: \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\") " pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 
17:11:39.605589 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf8nb\" (UniqueName: \"kubernetes.io/projected/b13ba1e9-1a24-407e-8cea-a78a4a65429b-kube-api-access-wf8nb\") pod \"dnsmasq-dns-76b5b778c5-f4bk8\" (UID: \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\") " pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.820092 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76b5b778c5-f4bk8"] Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.823679 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.836416 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-bc4l8"] Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.838034 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.849987 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-bc4l8"] Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.992906 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fda915ab-2411-484d-9fa8-7b80374f90cf-dns-svc\") pod \"dnsmasq-dns-ff89b6977-bc4l8\" (UID: \"fda915ab-2411-484d-9fa8-7b80374f90cf\") " pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:11:39 crc kubenswrapper[4749]: I0310 17:11:39.993220 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fda915ab-2411-484d-9fa8-7b80374f90cf-config\") pod \"dnsmasq-dns-ff89b6977-bc4l8\" (UID: \"fda915ab-2411-484d-9fa8-7b80374f90cf\") " pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:11:39 
crc kubenswrapper[4749]: I0310 17:11:39.993252 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-748ws\" (UniqueName: \"kubernetes.io/projected/fda915ab-2411-484d-9fa8-7b80374f90cf-kube-api-access-748ws\") pod \"dnsmasq-dns-ff89b6977-bc4l8\" (UID: \"fda915ab-2411-484d-9fa8-7b80374f90cf\") " pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.083212 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c44667757-46znw"] Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.095988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fda915ab-2411-484d-9fa8-7b80374f90cf-dns-svc\") pod \"dnsmasq-dns-ff89b6977-bc4l8\" (UID: \"fda915ab-2411-484d-9fa8-7b80374f90cf\") " pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.096097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fda915ab-2411-484d-9fa8-7b80374f90cf-config\") pod \"dnsmasq-dns-ff89b6977-bc4l8\" (UID: \"fda915ab-2411-484d-9fa8-7b80374f90cf\") " pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.096149 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-748ws\" (UniqueName: \"kubernetes.io/projected/fda915ab-2411-484d-9fa8-7b80374f90cf-kube-api-access-748ws\") pod \"dnsmasq-dns-ff89b6977-bc4l8\" (UID: \"fda915ab-2411-484d-9fa8-7b80374f90cf\") " pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.096894 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fda915ab-2411-484d-9fa8-7b80374f90cf-config\") pod \"dnsmasq-dns-ff89b6977-bc4l8\" (UID: 
\"fda915ab-2411-484d-9fa8-7b80374f90cf\") " pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.096994 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fda915ab-2411-484d-9fa8-7b80374f90cf-dns-svc\") pod \"dnsmasq-dns-ff89b6977-bc4l8\" (UID: \"fda915ab-2411-484d-9fa8-7b80374f90cf\") " pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.115057 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-748ws\" (UniqueName: \"kubernetes.io/projected/fda915ab-2411-484d-9fa8-7b80374f90cf-kube-api-access-748ws\") pod \"dnsmasq-dns-ff89b6977-bc4l8\" (UID: \"fda915ab-2411-484d-9fa8-7b80374f90cf\") " pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.179718 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-ktwwk"] Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.180050 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.221499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" event={"ID":"7ead0fba-de6a-4587-a7fa-053cea769d38","Type":"ContainerStarted","Data":"52c395c0c9d632f2db0879bbfcde9918208591bd01bc12075e82cb47a542f96a"} Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.223389 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-46znw" event={"ID":"dd456761-80be-4280-ac4a-4a9f80d6b3ed","Type":"ContainerStarted","Data":"a522b429f72dc3d2575724e3e6281b8972ae0896f3e732ce5df2fc93bebb0283"} Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.282032 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76b5b778c5-f4bk8"] Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.602067 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-bc4l8"] Mar 10 17:11:40 crc kubenswrapper[4749]: W0310 17:11:40.646789 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfda915ab_2411_484d_9fa8_7b80374f90cf.slice/crio-64d5c5b88c650c057437811b818765f13b9a0ceaf51058fe6d61e84b3bb6e39a WatchSource:0}: Error finding container 64d5c5b88c650c057437811b818765f13b9a0ceaf51058fe6d61e84b3bb6e39a: Status 404 returned error can't find the container with id 64d5c5b88c650c057437811b818765f13b9a0ceaf51058fe6d61e84b3bb6e39a Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.666435 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.669097 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.671736 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.677813 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.678113 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.678259 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.678432 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.678611 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.678746 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-nrlmg" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.720603 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.810351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.810411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/462db90c-a38b-45ca-9b68-6b7178e52fbe-pod-info\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.810434 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c534202a-35a1-4320-92ea-64818eedbbd0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c534202a-35a1-4320-92ea-64818eedbbd0\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.810451 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.810507 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-server-conf\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.810526 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.810543 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/462db90c-a38b-45ca-9b68-6b7178e52fbe-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.810590 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.810612 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-config-data\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.810658 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.810674 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77tmm\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-kube-api-access-77tmm\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.911804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.912073 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-config-data\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.912156 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.912226 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77tmm\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-kube-api-access-77tmm\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.912330 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.912419 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/462db90c-a38b-45ca-9b68-6b7178e52fbe-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.912528 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c534202a-35a1-4320-92ea-64818eedbbd0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c534202a-35a1-4320-92ea-64818eedbbd0\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.912642 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.912757 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-server-conf\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.912871 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.912947 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/462db90c-a38b-45ca-9b68-6b7178e52fbe-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: 
I0310 17:11:40.915657 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-config-data\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.915946 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.916056 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-server-conf\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.916414 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.916500 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.917648 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/462db90c-a38b-45ca-9b68-6b7178e52fbe-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.918031 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/462db90c-a38b-45ca-9b68-6b7178e52fbe-pod-info\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.918943 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.918985 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.919014 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c534202a-35a1-4320-92ea-64818eedbbd0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c534202a-35a1-4320-92ea-64818eedbbd0\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/de8c21fdda9e792f16e762fc10558fb2d006b79f3397dc6da868dfd14049f346/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.921290 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.923115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.923734 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.934215 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.934351 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-vrs88" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.934455 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.934237 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.934723 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.934914 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.936418 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.936735 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.939899 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77tmm\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-kube-api-access-77tmm\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:40 crc kubenswrapper[4749]: I0310 17:11:40.967876 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c534202a-35a1-4320-92ea-64818eedbbd0\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c534202a-35a1-4320-92ea-64818eedbbd0\") pod \"rabbitmq-server-0\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " pod="openstack/rabbitmq-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.015190 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.015546 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.015856 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.015993 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.016143 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.016256 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.016429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6zxh\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-kube-api-access-s6zxh\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.016564 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.016753 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.016875 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.017057 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.040461 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.118569 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.118844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.119598 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6zxh\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-kube-api-access-s6zxh\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.119656 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.119804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.119812 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.119828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.119900 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 
17:11:41.119981 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.120022 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.120043 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.120080 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.120754 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.120910 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.120940 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.121172 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.123875 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.124621 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.125422 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.126312 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.126340 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1a757f527bbff44cb7c7a8e9bf61d5f3057fd9c8b31a62a220fe4eb82fbadd55/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.128212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.139485 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6zxh\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-kube-api-access-s6zxh\") pod \"rabbitmq-cell1-server-0\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.168583 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.234423 4749 generic.go:334] "Generic (PLEG): container finished" podID="b13ba1e9-1a24-407e-8cea-a78a4a65429b" containerID="9d122d910422b9f411f9820a59d0de7218f2ed4f1af523e9f2cc8c6b42aee734" exitCode=0 Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.234471 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" event={"ID":"b13ba1e9-1a24-407e-8cea-a78a4a65429b","Type":"ContainerDied","Data":"9d122d910422b9f411f9820a59d0de7218f2ed4f1af523e9f2cc8c6b42aee734"} Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.234536 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" event={"ID":"b13ba1e9-1a24-407e-8cea-a78a4a65429b","Type":"ContainerStarted","Data":"65d0bc54e68157d44016d7b54b3869fd3274e58c33ed98ed0a3b4fd8a68d1fd8"} Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.237027 4749 generic.go:334] "Generic (PLEG): container finished" podID="7ead0fba-de6a-4587-a7fa-053cea769d38" containerID="fbc2a53a3de161acdeb03b30e8287567f97d2223f3457c2dcd9354ed1dd606db" exitCode=0 Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.237068 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" event={"ID":"7ead0fba-de6a-4587-a7fa-053cea769d38","Type":"ContainerDied","Data":"fbc2a53a3de161acdeb03b30e8287567f97d2223f3457c2dcd9354ed1dd606db"} Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.239600 4749 generic.go:334] "Generic (PLEG): container finished" podID="fda915ab-2411-484d-9fa8-7b80374f90cf" containerID="9d851fb97c5fb49053c3c13df53084049cf72fd891d2984de8d27c99f0224392" exitCode=0 Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.239642 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" 
event={"ID":"fda915ab-2411-484d-9fa8-7b80374f90cf","Type":"ContainerDied","Data":"9d851fb97c5fb49053c3c13df53084049cf72fd891d2984de8d27c99f0224392"} Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.239657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" event={"ID":"fda915ab-2411-484d-9fa8-7b80374f90cf","Type":"ContainerStarted","Data":"64d5c5b88c650c057437811b818765f13b9a0ceaf51058fe6d61e84b3bb6e39a"} Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.249170 4749 generic.go:334] "Generic (PLEG): container finished" podID="dd456761-80be-4280-ac4a-4a9f80d6b3ed" containerID="991af77f2ae99436c0f732b47c66caa8fb1b640add9ee2ba92e18b29aca8dfbe" exitCode=0 Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.249207 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-46znw" event={"ID":"dd456761-80be-4280-ac4a-4a9f80d6b3ed","Type":"ContainerDied","Data":"991af77f2ae99436c0f732b47c66caa8fb1b640add9ee2ba92e18b29aca8dfbe"} Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.286007 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.355009 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 17:11:41 crc kubenswrapper[4749]: W0310 17:11:41.399605 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod462db90c_a38b_45ca_9b68_6b7178e52fbe.slice/crio-cd685b32e7b3ec21cb07e69e2784716ecfc2ba1fcbf0a0fa36ac33b508972820 WatchSource:0}: Error finding container cd685b32e7b3ec21cb07e69e2784716ecfc2ba1fcbf0a0fa36ac33b508972820: Status 404 returned error can't find the container with id cd685b32e7b3ec21cb07e69e2784716ecfc2ba1fcbf0a0fa36ac33b508972820 Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.637612 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.651608 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.727403 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13ba1e9-1a24-407e-8cea-a78a4a65429b-dns-svc\") pod \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\" (UID: \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\") " Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.727465 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf8nb\" (UniqueName: \"kubernetes.io/projected/b13ba1e9-1a24-407e-8cea-a78a4a65429b-kube-api-access-wf8nb\") pod \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\" (UID: \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\") " Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.727548 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ead0fba-de6a-4587-a7fa-053cea769d38-config\") pod \"7ead0fba-de6a-4587-a7fa-053cea769d38\" (UID: \"7ead0fba-de6a-4587-a7fa-053cea769d38\") " Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.727630 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmv4c\" (UniqueName: \"kubernetes.io/projected/7ead0fba-de6a-4587-a7fa-053cea769d38-kube-api-access-xmv4c\") pod \"7ead0fba-de6a-4587-a7fa-053cea769d38\" (UID: \"7ead0fba-de6a-4587-a7fa-053cea769d38\") " Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.727668 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ead0fba-de6a-4587-a7fa-053cea769d38-dns-svc\") pod \"7ead0fba-de6a-4587-a7fa-053cea769d38\" (UID: \"7ead0fba-de6a-4587-a7fa-053cea769d38\") " Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.727686 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b13ba1e9-1a24-407e-8cea-a78a4a65429b-config\") pod \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\" (UID: \"b13ba1e9-1a24-407e-8cea-a78a4a65429b\") " Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.767518 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ead0fba-de6a-4587-a7fa-053cea769d38-kube-api-access-xmv4c" (OuterVolumeSpecName: "kube-api-access-xmv4c") pod "7ead0fba-de6a-4587-a7fa-053cea769d38" (UID: "7ead0fba-de6a-4587-a7fa-053cea769d38"). InnerVolumeSpecName "kube-api-access-xmv4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.767848 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b13ba1e9-1a24-407e-8cea-a78a4a65429b-kube-api-access-wf8nb" (OuterVolumeSpecName: "kube-api-access-wf8nb") pod "b13ba1e9-1a24-407e-8cea-a78a4a65429b" (UID: "b13ba1e9-1a24-407e-8cea-a78a4a65429b"). InnerVolumeSpecName "kube-api-access-wf8nb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.830588 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf8nb\" (UniqueName: \"kubernetes.io/projected/b13ba1e9-1a24-407e-8cea-a78a4a65429b-kube-api-access-wf8nb\") on node \"crc\" DevicePath \"\"" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.830623 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmv4c\" (UniqueName: \"kubernetes.io/projected/7ead0fba-de6a-4587-a7fa-053cea769d38-kube-api-access-xmv4c\") on node \"crc\" DevicePath \"\"" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.864543 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.881339 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ead0fba-de6a-4587-a7fa-053cea769d38-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ead0fba-de6a-4587-a7fa-053cea769d38" (UID: "7ead0fba-de6a-4587-a7fa-053cea769d38"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.883714 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13ba1e9-1a24-407e-8cea-a78a4a65429b-config" (OuterVolumeSpecName: "config") pod "b13ba1e9-1a24-407e-8cea-a78a4a65429b" (UID: "b13ba1e9-1a24-407e-8cea-a78a4a65429b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.893314 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13ba1e9-1a24-407e-8cea-a78a4a65429b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b13ba1e9-1a24-407e-8cea-a78a4a65429b" (UID: "b13ba1e9-1a24-407e-8cea-a78a4a65429b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.897585 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ead0fba-de6a-4587-a7fa-053cea769d38-config" (OuterVolumeSpecName: "config") pod "7ead0fba-de6a-4587-a7fa-053cea769d38" (UID: "7ead0fba-de6a-4587-a7fa-053cea769d38"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.932268 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ead0fba-de6a-4587-a7fa-053cea769d38-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.932803 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13ba1e9-1a24-407e-8cea-a78a4a65429b-config\") on node \"crc\" DevicePath \"\"" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.932813 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13ba1e9-1a24-407e-8cea-a78a4a65429b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 17:11:41 crc kubenswrapper[4749]: I0310 17:11:41.932821 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ead0fba-de6a-4587-a7fa-053cea769d38-config\") on node \"crc\" DevicePath \"\"" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.096017 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 10 17:11:42 crc kubenswrapper[4749]: E0310 17:11:42.096659 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b13ba1e9-1a24-407e-8cea-a78a4a65429b" containerName="init" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.096770 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b13ba1e9-1a24-407e-8cea-a78a4a65429b" 
containerName="init" Mar 10 17:11:42 crc kubenswrapper[4749]: E0310 17:11:42.096872 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ead0fba-de6a-4587-a7fa-053cea769d38" containerName="init" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.096954 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ead0fba-de6a-4587-a7fa-053cea769d38" containerName="init" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.097220 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b13ba1e9-1a24-407e-8cea-a78a4a65429b" containerName="init" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.097322 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ead0fba-de6a-4587-a7fa-053cea769d38" containerName="init" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.098325 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.101965 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.102593 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.102858 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-82gbj" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.103093 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.109318 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.116070 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 
17:11:42.236301 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e297480-747d-4673-965d-dd06d23c11c1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.236351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0ab26cb0-34e2-4add-a26c-11239e915d65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ab26cb0-34e2-4add-a26c-11239e915d65\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.236631 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9e297480-747d-4673-965d-dd06d23c11c1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.236682 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e297480-747d-4673-965d-dd06d23c11c1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.236814 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn8zk\" (UniqueName: \"kubernetes.io/projected/9e297480-747d-4673-965d-dd06d23c11c1-kube-api-access-qn8zk\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc 
kubenswrapper[4749]: I0310 17:11:42.236845 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9e297480-747d-4673-965d-dd06d23c11c1-config-data-default\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.236965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9e297480-747d-4673-965d-dd06d23c11c1-kolla-config\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.236994 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e297480-747d-4673-965d-dd06d23c11c1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.257799 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.257787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55c76fd6b7-ktwwk" event={"ID":"7ead0fba-de6a-4587-a7fa-053cea769d38","Type":"ContainerDied","Data":"52c395c0c9d632f2db0879bbfcde9918208591bd01bc12075e82cb47a542f96a"} Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.257956 4749 scope.go:117] "RemoveContainer" containerID="fbc2a53a3de161acdeb03b30e8287567f97d2223f3457c2dcd9354ed1dd606db" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.260219 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" event={"ID":"fda915ab-2411-484d-9fa8-7b80374f90cf","Type":"ContainerStarted","Data":"7d4297d3714ba5af5eb7733e68eaec460b0364dbb956b0ea5f4cf7ee1e95ba41"} Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.260801 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.262774 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-46znw" event={"ID":"dd456761-80be-4280-ac4a-4a9f80d6b3ed","Type":"ContainerStarted","Data":"da90d052932412e758423712fc8c5a58d0bfdcc3c86dfddfe1f840d45c7c2054"} Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.263434 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c44667757-46znw" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.265016 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"462db90c-a38b-45ca-9b68-6b7178e52fbe","Type":"ContainerStarted","Data":"cd685b32e7b3ec21cb07e69e2784716ecfc2ba1fcbf0a0fa36ac33b508972820"} Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.267482 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" event={"ID":"b13ba1e9-1a24-407e-8cea-a78a4a65429b","Type":"ContainerDied","Data":"65d0bc54e68157d44016d7b54b3869fd3274e58c33ed98ed0a3b4fd8a68d1fd8"} Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.267565 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5b778c5-f4bk8" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.268395 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4","Type":"ContainerStarted","Data":"9e6fad4dc5ad41577653c1be7e9db3353d332e44f69d25c810226a819a7345e1"} Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.293174 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" podStartSLOduration=3.293153463 podStartE2EDuration="3.293153463s" podCreationTimestamp="2026-03-10 17:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:11:42.285983057 +0000 UTC m=+4999.407848744" watchObservedRunningTime="2026-03-10 17:11:42.293153463 +0000 UTC m=+4999.415019160" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.310934 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c44667757-46znw" podStartSLOduration=3.310913636 podStartE2EDuration="3.310913636s" podCreationTimestamp="2026-03-10 17:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:11:42.305753686 +0000 UTC m=+4999.427619363" watchObservedRunningTime="2026-03-10 17:11:42.310913636 +0000 UTC m=+4999.432779323" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.338247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/9e297480-747d-4673-965d-dd06d23c11c1-config-data-default\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.338621 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9e297480-747d-4673-965d-dd06d23c11c1-kolla-config\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.338728 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e297480-747d-4673-965d-dd06d23c11c1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.338898 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e297480-747d-4673-965d-dd06d23c11c1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.338988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0ab26cb0-34e2-4add-a26c-11239e915d65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ab26cb0-34e2-4add-a26c-11239e915d65\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.339177 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/9e297480-747d-4673-965d-dd06d23c11c1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.339297 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e297480-747d-4673-965d-dd06d23c11c1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.339443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn8zk\" (UniqueName: \"kubernetes.io/projected/9e297480-747d-4673-965d-dd06d23c11c1-kube-api-access-qn8zk\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.341107 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9e297480-747d-4673-965d-dd06d23c11c1-config-data-default\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.341722 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9e297480-747d-4673-965d-dd06d23c11c1-kolla-config\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.343549 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76b5b778c5-f4bk8"] Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.344292 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9e297480-747d-4673-965d-dd06d23c11c1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.345393 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e297480-747d-4673-965d-dd06d23c11c1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.347190 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.347234 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0ab26cb0-34e2-4add-a26c-11239e915d65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ab26cb0-34e2-4add-a26c-11239e915d65\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3a5d1e5659c47f94f55f45ef37904c2a366f5d2be4560a3142905613948b454a/globalmount\"" pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.372618 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76b5b778c5-f4bk8"] Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.377289 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e297480-747d-4673-965d-dd06d23c11c1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.377453 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e297480-747d-4673-965d-dd06d23c11c1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.394901 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0ab26cb0-34e2-4add-a26c-11239e915d65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ab26cb0-34e2-4add-a26c-11239e915d65\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.395222 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn8zk\" (UniqueName: \"kubernetes.io/projected/9e297480-747d-4673-965d-dd06d23c11c1-kube-api-access-qn8zk\") pod \"openstack-galera-0\" (UID: \"9e297480-747d-4673-965d-dd06d23c11c1\") " pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.403434 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-ktwwk"] Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.422764 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.429317 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-ktwwk"] Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.478472 4749 scope.go:117] "RemoveContainer" containerID="9d122d910422b9f411f9820a59d0de7218f2ed4f1af523e9f2cc8c6b42aee734" Mar 10 17:11:42 crc kubenswrapper[4749]: I0310 17:11:42.809921 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 17:11:42 crc kubenswrapper[4749]: W0310 17:11:42.824980 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e297480_747d_4673_965d_dd06d23c11c1.slice/crio-e4f296ca2ae2df9392340c51e8bc1ea344fd414383aa4532c57033cad9d8e4a8 WatchSource:0}: Error finding container e4f296ca2ae2df9392340c51e8bc1ea344fd414383aa4532c57033cad9d8e4a8: Status 404 returned error can't find the container with id e4f296ca2ae2df9392340c51e8bc1ea344fd414383aa4532c57033cad9d8e4a8 Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.280217 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4","Type":"ContainerStarted","Data":"f42e112e31917c1371ca11a9cfd21d2142663fd32ad7edfc5037e72d11dfef7e"} Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.285532 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9e297480-747d-4673-965d-dd06d23c11c1","Type":"ContainerStarted","Data":"a0dfc4715332ed5ecc42d4ac0b959c0d2e423c8f52419d179d69c878fd95ee25"} Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.285571 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"9e297480-747d-4673-965d-dd06d23c11c1","Type":"ContainerStarted","Data":"e4f296ca2ae2df9392340c51e8bc1ea344fd414383aa4532c57033cad9d8e4a8"} Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.288206 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"462db90c-a38b-45ca-9b68-6b7178e52fbe","Type":"ContainerStarted","Data":"bb68ff55f2f183f37f93f85f2733f28e4393f656c9e0a3a88a953e71d63e3d10"} Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.435827 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.437220 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.439164 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.439497 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.442659 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-lt2fq" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.442809 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.452052 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.559128 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b8004f-cc41-4d8d-ba10-c18e745159e6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.559286 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93b8004f-cc41-4d8d-ba10-c18e745159e6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.559439 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c1b80082-7575-4799-afd0-c31aa7fcd450\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1b80082-7575-4799-afd0-c31aa7fcd450\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.559544 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93b8004f-cc41-4d8d-ba10-c18e745159e6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.559572 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b8004f-cc41-4d8d-ba10-c18e745159e6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.559600 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/93b8004f-cc41-4d8d-ba10-c18e745159e6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.559687 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93b8004f-cc41-4d8d-ba10-c18e745159e6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.559746 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzx6w\" (UniqueName: \"kubernetes.io/projected/93b8004f-cc41-4d8d-ba10-c18e745159e6-kube-api-access-wzx6w\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.616160 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ead0fba-de6a-4587-a7fa-053cea769d38" path="/var/lib/kubelet/pods/7ead0fba-de6a-4587-a7fa-053cea769d38/volumes" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.616812 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b13ba1e9-1a24-407e-8cea-a78a4a65429b" path="/var/lib/kubelet/pods/b13ba1e9-1a24-407e-8cea-a78a4a65429b/volumes" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.660826 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93b8004f-cc41-4d8d-ba10-c18e745159e6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.660864 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b8004f-cc41-4d8d-ba10-c18e745159e6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.660888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93b8004f-cc41-4d8d-ba10-c18e745159e6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.660919 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93b8004f-cc41-4d8d-ba10-c18e745159e6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.660952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzx6w\" (UniqueName: \"kubernetes.io/projected/93b8004f-cc41-4d8d-ba10-c18e745159e6-kube-api-access-wzx6w\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.661008 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b8004f-cc41-4d8d-ba10-c18e745159e6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.661044 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93b8004f-cc41-4d8d-ba10-c18e745159e6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.661082 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1b80082-7575-4799-afd0-c31aa7fcd450\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1b80082-7575-4799-afd0-c31aa7fcd450\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.661737 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93b8004f-cc41-4d8d-ba10-c18e745159e6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.661985 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93b8004f-cc41-4d8d-ba10-c18e745159e6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.661985 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93b8004f-cc41-4d8d-ba10-c18e745159e6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.662866 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/93b8004f-cc41-4d8d-ba10-c18e745159e6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.680010 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.680072 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1b80082-7575-4799-afd0-c31aa7fcd450\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1b80082-7575-4799-afd0-c31aa7fcd450\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/07e07be1fac8f34cbf092794f9ff33d7685ce38b08f1ed5322ba3af900cfdec0/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.682363 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b8004f-cc41-4d8d-ba10-c18e745159e6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.686537 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b8004f-cc41-4d8d-ba10-c18e745159e6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.692881 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzx6w\" (UniqueName: 
\"kubernetes.io/projected/93b8004f-cc41-4d8d-ba10-c18e745159e6-kube-api-access-wzx6w\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.717018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1b80082-7575-4799-afd0-c31aa7fcd450\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1b80082-7575-4799-afd0-c31aa7fcd450\") pod \"openstack-cell1-galera-0\" (UID: \"93b8004f-cc41-4d8d-ba10-c18e745159e6\") " pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.754881 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.857763 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.858969 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.861841 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.862137 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.862800 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4b7gv" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.884078 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.967003 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjnmc\" (UniqueName: \"kubernetes.io/projected/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-kube-api-access-pjnmc\") pod \"memcached-0\" (UID: \"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.967058 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.967083 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-kolla-config\") pod \"memcached-0\" (UID: \"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.967097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:43 crc kubenswrapper[4749]: I0310 17:11:43.967149 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-config-data\") pod \"memcached-0\" (UID: \"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:44 crc kubenswrapper[4749]: I0310 17:11:44.070401 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjnmc\" (UniqueName: \"kubernetes.io/projected/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-kube-api-access-pjnmc\") pod \"memcached-0\" (UID: \"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:44 crc kubenswrapper[4749]: I0310 17:11:44.070452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:44 crc kubenswrapper[4749]: I0310 17:11:44.070475 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-kolla-config\") pod \"memcached-0\" (UID: \"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:44 crc kubenswrapper[4749]: I0310 17:11:44.070500 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:44 crc kubenswrapper[4749]: I0310 17:11:44.070536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-config-data\") pod \"memcached-0\" (UID: \"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:44 crc kubenswrapper[4749]: I0310 17:11:44.072022 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-kolla-config\") pod \"memcached-0\" (UID: \"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:44 crc kubenswrapper[4749]: I0310 17:11:44.072248 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-config-data\") pod \"memcached-0\" (UID: \"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:44 crc kubenswrapper[4749]: I0310 17:11:44.077132 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:44 crc kubenswrapper[4749]: I0310 17:11:44.077733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:44 crc kubenswrapper[4749]: I0310 17:11:44.086917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjnmc\" (UniqueName: 
\"kubernetes.io/projected/097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d-kube-api-access-pjnmc\") pod \"memcached-0\" (UID: \"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d\") " pod="openstack/memcached-0" Mar 10 17:11:44 crc kubenswrapper[4749]: I0310 17:11:44.177847 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 17:11:44 crc kubenswrapper[4749]: I0310 17:11:44.294975 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 17:11:44 crc kubenswrapper[4749]: W0310 17:11:44.297138 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93b8004f_cc41_4d8d_ba10_c18e745159e6.slice/crio-8295c7a2b671eab1f50e1a773c8cdcfbe67012ddf9e23d3feace957baf1230b0 WatchSource:0}: Error finding container 8295c7a2b671eab1f50e1a773c8cdcfbe67012ddf9e23d3feace957baf1230b0: Status 404 returned error can't find the container with id 8295c7a2b671eab1f50e1a773c8cdcfbe67012ddf9e23d3feace957baf1230b0 Mar 10 17:11:44 crc kubenswrapper[4749]: I0310 17:11:44.597332 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 17:11:45 crc kubenswrapper[4749]: I0310 17:11:45.309954 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"93b8004f-cc41-4d8d-ba10-c18e745159e6","Type":"ContainerStarted","Data":"6981193bf2987e860a0857db6908d406818f97e08e13b7e35a3e1fbd4fb097a4"} Mar 10 17:11:45 crc kubenswrapper[4749]: I0310 17:11:45.311204 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"93b8004f-cc41-4d8d-ba10-c18e745159e6","Type":"ContainerStarted","Data":"8295c7a2b671eab1f50e1a773c8cdcfbe67012ddf9e23d3feace957baf1230b0"} Mar 10 17:11:45 crc kubenswrapper[4749]: I0310 17:11:45.312110 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d","Type":"ContainerStarted","Data":"c9d0278df0f9655e0a2040ca4ed6de158627d8e28f65acfe4ef28c3955c1f70e"} Mar 10 17:11:45 crc kubenswrapper[4749]: I0310 17:11:45.312207 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d","Type":"ContainerStarted","Data":"7b77cec6fb2e7f6bcf21ce3a4ae4729d52436f0d80f9eed6992ffc1d55450cb5"} Mar 10 17:11:45 crc kubenswrapper[4749]: I0310 17:11:45.312316 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 10 17:11:45 crc kubenswrapper[4749]: I0310 17:11:45.351348 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.35132867 podStartE2EDuration="2.35132867s" podCreationTimestamp="2026-03-10 17:11:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:11:45.345329326 +0000 UTC m=+5002.467195033" watchObservedRunningTime="2026-03-10 17:11:45.35132867 +0000 UTC m=+5002.473194367" Mar 10 17:11:46 crc kubenswrapper[4749]: I0310 17:11:46.320774 4749 generic.go:334] "Generic (PLEG): container finished" podID="9e297480-747d-4673-965d-dd06d23c11c1" containerID="a0dfc4715332ed5ecc42d4ac0b959c0d2e423c8f52419d179d69c878fd95ee25" exitCode=0 Mar 10 17:11:46 crc kubenswrapper[4749]: I0310 17:11:46.320870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9e297480-747d-4673-965d-dd06d23c11c1","Type":"ContainerDied","Data":"a0dfc4715332ed5ecc42d4ac0b959c0d2e423c8f52419d179d69c878fd95ee25"} Mar 10 17:11:47 crc kubenswrapper[4749]: I0310 17:11:47.339054 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"9e297480-747d-4673-965d-dd06d23c11c1","Type":"ContainerStarted","Data":"a3ff46bb9c1104f6ca1ed8cfc7acc9f8c69d08e8fe6e432d4abb0986b9dc2543"} Mar 10 17:11:47 crc kubenswrapper[4749]: I0310 17:11:47.372512 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=6.372489485 podStartE2EDuration="6.372489485s" podCreationTimestamp="2026-03-10 17:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:11:47.359358327 +0000 UTC m=+5004.481224024" watchObservedRunningTime="2026-03-10 17:11:47.372489485 +0000 UTC m=+5004.494355182" Mar 10 17:11:48 crc kubenswrapper[4749]: I0310 17:11:48.350798 4749 generic.go:334] "Generic (PLEG): container finished" podID="93b8004f-cc41-4d8d-ba10-c18e745159e6" containerID="6981193bf2987e860a0857db6908d406818f97e08e13b7e35a3e1fbd4fb097a4" exitCode=0 Mar 10 17:11:48 crc kubenswrapper[4749]: I0310 17:11:48.350906 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"93b8004f-cc41-4d8d-ba10-c18e745159e6","Type":"ContainerDied","Data":"6981193bf2987e860a0857db6908d406818f97e08e13b7e35a3e1fbd4fb097a4"} Mar 10 17:11:49 crc kubenswrapper[4749]: I0310 17:11:49.179044 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 10 17:11:49 crc kubenswrapper[4749]: I0310 17:11:49.358909 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"93b8004f-cc41-4d8d-ba10-c18e745159e6","Type":"ContainerStarted","Data":"5945f4376f19e3922ce765ea9804406c6beafc84cfa892583a5e6100b836f0a3"} Mar 10 17:11:49 crc kubenswrapper[4749]: I0310 17:11:49.385624 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.385601931 podStartE2EDuration="7.385601931s" 
podCreationTimestamp="2026-03-10 17:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:11:49.378783645 +0000 UTC m=+5006.500649332" watchObservedRunningTime="2026-03-10 17:11:49.385601931 +0000 UTC m=+5006.507467628" Mar 10 17:11:49 crc kubenswrapper[4749]: I0310 17:11:49.492645 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c44667757-46znw" Mar 10 17:11:50 crc kubenswrapper[4749]: I0310 17:11:50.181668 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:11:50 crc kubenswrapper[4749]: I0310 17:11:50.257225 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-46znw"] Mar 10 17:11:50 crc kubenswrapper[4749]: I0310 17:11:50.365981 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c44667757-46znw" podUID="dd456761-80be-4280-ac4a-4a9f80d6b3ed" containerName="dnsmasq-dns" containerID="cri-o://da90d052932412e758423712fc8c5a58d0bfdcc3c86dfddfe1f840d45c7c2054" gracePeriod=10 Mar 10 17:11:50 crc kubenswrapper[4749]: I0310 17:11:50.780167 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-46znw" Mar 10 17:11:50 crc kubenswrapper[4749]: I0310 17:11:50.900746 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd456761-80be-4280-ac4a-4a9f80d6b3ed-config\") pod \"dd456761-80be-4280-ac4a-4a9f80d6b3ed\" (UID: \"dd456761-80be-4280-ac4a-4a9f80d6b3ed\") " Mar 10 17:11:50 crc kubenswrapper[4749]: I0310 17:11:50.900832 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl26k\" (UniqueName: \"kubernetes.io/projected/dd456761-80be-4280-ac4a-4a9f80d6b3ed-kube-api-access-nl26k\") pod \"dd456761-80be-4280-ac4a-4a9f80d6b3ed\" (UID: \"dd456761-80be-4280-ac4a-4a9f80d6b3ed\") " Mar 10 17:11:50 crc kubenswrapper[4749]: I0310 17:11:50.906144 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd456761-80be-4280-ac4a-4a9f80d6b3ed-kube-api-access-nl26k" (OuterVolumeSpecName: "kube-api-access-nl26k") pod "dd456761-80be-4280-ac4a-4a9f80d6b3ed" (UID: "dd456761-80be-4280-ac4a-4a9f80d6b3ed"). InnerVolumeSpecName "kube-api-access-nl26k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:11:50 crc kubenswrapper[4749]: I0310 17:11:50.949310 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd456761-80be-4280-ac4a-4a9f80d6b3ed-config" (OuterVolumeSpecName: "config") pod "dd456761-80be-4280-ac4a-4a9f80d6b3ed" (UID: "dd456761-80be-4280-ac4a-4a9f80d6b3ed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.002774 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd456761-80be-4280-ac4a-4a9f80d6b3ed-config\") on node \"crc\" DevicePath \"\"" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.002814 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl26k\" (UniqueName: \"kubernetes.io/projected/dd456761-80be-4280-ac4a-4a9f80d6b3ed-kube-api-access-nl26k\") on node \"crc\" DevicePath \"\"" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.101624 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pj4x8"] Mar 10 17:11:51 crc kubenswrapper[4749]: E0310 17:11:51.101962 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd456761-80be-4280-ac4a-4a9f80d6b3ed" containerName="dnsmasq-dns" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.101986 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd456761-80be-4280-ac4a-4a9f80d6b3ed" containerName="dnsmasq-dns" Mar 10 17:11:51 crc kubenswrapper[4749]: E0310 17:11:51.102017 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd456761-80be-4280-ac4a-4a9f80d6b3ed" containerName="init" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.102027 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd456761-80be-4280-ac4a-4a9f80d6b3ed" containerName="init" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.102186 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd456761-80be-4280-ac4a-4a9f80d6b3ed" containerName="dnsmasq-dns" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.103443 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.120606 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pj4x8"] Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.205956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586828c3-61e9-405d-b005-0b0b265ea71e-catalog-content\") pod \"community-operators-pj4x8\" (UID: \"586828c3-61e9-405d-b005-0b0b265ea71e\") " pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.206055 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586828c3-61e9-405d-b005-0b0b265ea71e-utilities\") pod \"community-operators-pj4x8\" (UID: \"586828c3-61e9-405d-b005-0b0b265ea71e\") " pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.206105 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxbbn\" (UniqueName: \"kubernetes.io/projected/586828c3-61e9-405d-b005-0b0b265ea71e-kube-api-access-kxbbn\") pod \"community-operators-pj4x8\" (UID: \"586828c3-61e9-405d-b005-0b0b265ea71e\") " pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.307361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586828c3-61e9-405d-b005-0b0b265ea71e-catalog-content\") pod \"community-operators-pj4x8\" (UID: \"586828c3-61e9-405d-b005-0b0b265ea71e\") " pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.307478 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586828c3-61e9-405d-b005-0b0b265ea71e-utilities\") pod \"community-operators-pj4x8\" (UID: \"586828c3-61e9-405d-b005-0b0b265ea71e\") " pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.307519 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxbbn\" (UniqueName: \"kubernetes.io/projected/586828c3-61e9-405d-b005-0b0b265ea71e-kube-api-access-kxbbn\") pod \"community-operators-pj4x8\" (UID: \"586828c3-61e9-405d-b005-0b0b265ea71e\") " pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.308133 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586828c3-61e9-405d-b005-0b0b265ea71e-catalog-content\") pod \"community-operators-pj4x8\" (UID: \"586828c3-61e9-405d-b005-0b0b265ea71e\") " pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.308184 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586828c3-61e9-405d-b005-0b0b265ea71e-utilities\") pod \"community-operators-pj4x8\" (UID: \"586828c3-61e9-405d-b005-0b0b265ea71e\") " pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.337510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxbbn\" (UniqueName: \"kubernetes.io/projected/586828c3-61e9-405d-b005-0b0b265ea71e-kube-api-access-kxbbn\") pod \"community-operators-pj4x8\" (UID: \"586828c3-61e9-405d-b005-0b0b265ea71e\") " pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.375529 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="dd456761-80be-4280-ac4a-4a9f80d6b3ed" containerID="da90d052932412e758423712fc8c5a58d0bfdcc3c86dfddfe1f840d45c7c2054" exitCode=0 Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.375576 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-46znw" event={"ID":"dd456761-80be-4280-ac4a-4a9f80d6b3ed","Type":"ContainerDied","Data":"da90d052932412e758423712fc8c5a58d0bfdcc3c86dfddfe1f840d45c7c2054"} Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.375603 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-46znw" event={"ID":"dd456761-80be-4280-ac4a-4a9f80d6b3ed","Type":"ContainerDied","Data":"a522b429f72dc3d2575724e3e6281b8972ae0896f3e732ce5df2fc93bebb0283"} Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.375608 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-46znw" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.375642 4749 scope.go:117] "RemoveContainer" containerID="da90d052932412e758423712fc8c5a58d0bfdcc3c86dfddfe1f840d45c7c2054" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.391991 4749 scope.go:117] "RemoveContainer" containerID="991af77f2ae99436c0f732b47c66caa8fb1b640add9ee2ba92e18b29aca8dfbe" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.409117 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-46znw"] Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.415287 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c44667757-46znw"] Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.423892 4749 scope.go:117] "RemoveContainer" containerID="da90d052932412e758423712fc8c5a58d0bfdcc3c86dfddfe1f840d45c7c2054" Mar 10 17:11:51 crc kubenswrapper[4749]: E0310 17:11:51.424483 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"da90d052932412e758423712fc8c5a58d0bfdcc3c86dfddfe1f840d45c7c2054\": container with ID starting with da90d052932412e758423712fc8c5a58d0bfdcc3c86dfddfe1f840d45c7c2054 not found: ID does not exist" containerID="da90d052932412e758423712fc8c5a58d0bfdcc3c86dfddfe1f840d45c7c2054" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.424530 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da90d052932412e758423712fc8c5a58d0bfdcc3c86dfddfe1f840d45c7c2054"} err="failed to get container status \"da90d052932412e758423712fc8c5a58d0bfdcc3c86dfddfe1f840d45c7c2054\": rpc error: code = NotFound desc = could not find container \"da90d052932412e758423712fc8c5a58d0bfdcc3c86dfddfe1f840d45c7c2054\": container with ID starting with da90d052932412e758423712fc8c5a58d0bfdcc3c86dfddfe1f840d45c7c2054 not found: ID does not exist" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.424555 4749 scope.go:117] "RemoveContainer" containerID="991af77f2ae99436c0f732b47c66caa8fb1b640add9ee2ba92e18b29aca8dfbe" Mar 10 17:11:51 crc kubenswrapper[4749]: E0310 17:11:51.424898 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"991af77f2ae99436c0f732b47c66caa8fb1b640add9ee2ba92e18b29aca8dfbe\": container with ID starting with 991af77f2ae99436c0f732b47c66caa8fb1b640add9ee2ba92e18b29aca8dfbe not found: ID does not exist" containerID="991af77f2ae99436c0f732b47c66caa8fb1b640add9ee2ba92e18b29aca8dfbe" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.424924 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"991af77f2ae99436c0f732b47c66caa8fb1b640add9ee2ba92e18b29aca8dfbe"} err="failed to get container status \"991af77f2ae99436c0f732b47c66caa8fb1b640add9ee2ba92e18b29aca8dfbe\": rpc error: code = NotFound desc = could not find container \"991af77f2ae99436c0f732b47c66caa8fb1b640add9ee2ba92e18b29aca8dfbe\": container with ID 
starting with 991af77f2ae99436c0f732b47c66caa8fb1b640add9ee2ba92e18b29aca8dfbe not found: ID does not exist" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.436538 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.624991 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd456761-80be-4280-ac4a-4a9f80d6b3ed" path="/var/lib/kubelet/pods/dd456761-80be-4280-ac4a-4a9f80d6b3ed/volumes" Mar 10 17:11:51 crc kubenswrapper[4749]: E0310 17:11:51.811662 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.199:44526->38.102.83.199:40501: write tcp 38.102.83.199:44526->38.102.83.199:40501: write: broken pipe Mar 10 17:11:51 crc kubenswrapper[4749]: I0310 17:11:51.961666 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pj4x8"] Mar 10 17:11:51 crc kubenswrapper[4749]: W0310 17:11:51.965709 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod586828c3_61e9_405d_b005_0b0b265ea71e.slice/crio-49a441b974f5b41ce9e6b9c94c1707192df92b5568b6f2c87d0d27d3e99c5e76 WatchSource:0}: Error finding container 49a441b974f5b41ce9e6b9c94c1707192df92b5568b6f2c87d0d27d3e99c5e76: Status 404 returned error can't find the container with id 49a441b974f5b41ce9e6b9c94c1707192df92b5568b6f2c87d0d27d3e99c5e76 Mar 10 17:11:52 crc kubenswrapper[4749]: I0310 17:11:52.384791 4749 generic.go:334] "Generic (PLEG): container finished" podID="586828c3-61e9-405d-b005-0b0b265ea71e" containerID="deec1da9d05f1c826c9d6ca7d512c32afa6ca32ee6f6213ab18e9596f5b69e4b" exitCode=0 Mar 10 17:11:52 crc kubenswrapper[4749]: I0310 17:11:52.384903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pj4x8" 
event={"ID":"586828c3-61e9-405d-b005-0b0b265ea71e","Type":"ContainerDied","Data":"deec1da9d05f1c826c9d6ca7d512c32afa6ca32ee6f6213ab18e9596f5b69e4b"} Mar 10 17:11:52 crc kubenswrapper[4749]: I0310 17:11:52.385166 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pj4x8" event={"ID":"586828c3-61e9-405d-b005-0b0b265ea71e","Type":"ContainerStarted","Data":"49a441b974f5b41ce9e6b9c94c1707192df92b5568b6f2c87d0d27d3e99c5e76"} Mar 10 17:11:52 crc kubenswrapper[4749]: I0310 17:11:52.387845 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 17:11:52 crc kubenswrapper[4749]: I0310 17:11:52.423864 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 10 17:11:52 crc kubenswrapper[4749]: I0310 17:11:52.423959 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 10 17:11:53 crc kubenswrapper[4749]: I0310 17:11:53.755451 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:53 crc kubenswrapper[4749]: I0310 17:11:53.755527 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:54 crc kubenswrapper[4749]: I0310 17:11:54.404064 4749 generic.go:334] "Generic (PLEG): container finished" podID="586828c3-61e9-405d-b005-0b0b265ea71e" containerID="0b6133143ffeac60b1ea2efc0dd60c48e1b6aa83eac1a8c1344ba370446051f6" exitCode=0 Mar 10 17:11:54 crc kubenswrapper[4749]: I0310 17:11:54.404144 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pj4x8" event={"ID":"586828c3-61e9-405d-b005-0b0b265ea71e","Type":"ContainerDied","Data":"0b6133143ffeac60b1ea2efc0dd60c48e1b6aa83eac1a8c1344ba370446051f6"} Mar 10 17:11:54 crc kubenswrapper[4749]: I0310 17:11:54.758020 4749 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 10 17:11:54 crc kubenswrapper[4749]: I0310 17:11:54.832805 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 10 17:11:55 crc kubenswrapper[4749]: I0310 17:11:55.418370 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pj4x8" event={"ID":"586828c3-61e9-405d-b005-0b0b265ea71e","Type":"ContainerStarted","Data":"6140f41495fffd924c5a153f205a8ca51d3780e8286ca13c28c86bee8b475948"} Mar 10 17:11:55 crc kubenswrapper[4749]: I0310 17:11:55.460489 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pj4x8" podStartSLOduration=1.951186578 podStartE2EDuration="4.460466175s" podCreationTimestamp="2026-03-10 17:11:51 +0000 UTC" firstStartedPulling="2026-03-10 17:11:52.387017312 +0000 UTC m=+5009.508883009" lastFinishedPulling="2026-03-10 17:11:54.896296909 +0000 UTC m=+5012.018162606" observedRunningTime="2026-03-10 17:11:55.453959468 +0000 UTC m=+5012.575825165" watchObservedRunningTime="2026-03-10 17:11:55.460466175 +0000 UTC m=+5012.582331862" Mar 10 17:11:56 crc kubenswrapper[4749]: I0310 17:11:56.119265 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 10 17:11:56 crc kubenswrapper[4749]: I0310 17:11:56.223719 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 10 17:12:00 crc kubenswrapper[4749]: I0310 17:12:00.154214 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552712-scffb"] Mar 10 17:12:00 crc kubenswrapper[4749]: I0310 17:12:00.156308 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552712-scffb" Mar 10 17:12:00 crc kubenswrapper[4749]: I0310 17:12:00.159895 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:12:00 crc kubenswrapper[4749]: I0310 17:12:00.159916 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:12:00 crc kubenswrapper[4749]: I0310 17:12:00.160022 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:12:00 crc kubenswrapper[4749]: I0310 17:12:00.179073 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552712-scffb"] Mar 10 17:12:00 crc kubenswrapper[4749]: I0310 17:12:00.350504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nfx6\" (UniqueName: \"kubernetes.io/projected/486a1f15-4abd-46a9-9ce1-aa79e0e3cc94-kube-api-access-8nfx6\") pod \"auto-csr-approver-29552712-scffb\" (UID: \"486a1f15-4abd-46a9-9ce1-aa79e0e3cc94\") " pod="openshift-infra/auto-csr-approver-29552712-scffb" Mar 10 17:12:00 crc kubenswrapper[4749]: I0310 17:12:00.452025 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nfx6\" (UniqueName: \"kubernetes.io/projected/486a1f15-4abd-46a9-9ce1-aa79e0e3cc94-kube-api-access-8nfx6\") pod \"auto-csr-approver-29552712-scffb\" (UID: \"486a1f15-4abd-46a9-9ce1-aa79e0e3cc94\") " pod="openshift-infra/auto-csr-approver-29552712-scffb" Mar 10 17:12:00 crc kubenswrapper[4749]: I0310 17:12:00.489085 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nfx6\" (UniqueName: \"kubernetes.io/projected/486a1f15-4abd-46a9-9ce1-aa79e0e3cc94-kube-api-access-8nfx6\") pod \"auto-csr-approver-29552712-scffb\" (UID: \"486a1f15-4abd-46a9-9ce1-aa79e0e3cc94\") " 
pod="openshift-infra/auto-csr-approver-29552712-scffb" Mar 10 17:12:00 crc kubenswrapper[4749]: I0310 17:12:00.568501 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552712-scffb" Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.006113 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552712-scffb"] Mar 10 17:12:01 crc kubenswrapper[4749]: W0310 17:12:01.008574 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod486a1f15_4abd_46a9_9ce1_aa79e0e3cc94.slice/crio-2e87feb5bb94a273d1b2fd164e5ca9498f3572f6b9d6a273640e4357dc93dcf6 WatchSource:0}: Error finding container 2e87feb5bb94a273d1b2fd164e5ca9498f3572f6b9d6a273640e4357dc93dcf6: Status 404 returned error can't find the container with id 2e87feb5bb94a273d1b2fd164e5ca9498f3572f6b9d6a273640e4357dc93dcf6 Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.056899 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-8227q"] Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.058191 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8227q" Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.060547 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df48470e-0356-46bc-a02e-f8dcbccc903c-operator-scripts\") pod \"root-account-create-update-8227q\" (UID: \"df48470e-0356-46bc-a02e-f8dcbccc903c\") " pod="openstack/root-account-create-update-8227q" Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.061050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkmzq\" (UniqueName: \"kubernetes.io/projected/df48470e-0356-46bc-a02e-f8dcbccc903c-kube-api-access-zkmzq\") pod \"root-account-create-update-8227q\" (UID: \"df48470e-0356-46bc-a02e-f8dcbccc903c\") " pod="openstack/root-account-create-update-8227q" Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.061832 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.077458 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8227q"] Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.162753 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df48470e-0356-46bc-a02e-f8dcbccc903c-operator-scripts\") pod \"root-account-create-update-8227q\" (UID: \"df48470e-0356-46bc-a02e-f8dcbccc903c\") " pod="openstack/root-account-create-update-8227q" Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.162881 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkmzq\" (UniqueName: \"kubernetes.io/projected/df48470e-0356-46bc-a02e-f8dcbccc903c-kube-api-access-zkmzq\") pod \"root-account-create-update-8227q\" (UID: 
\"df48470e-0356-46bc-a02e-f8dcbccc903c\") " pod="openstack/root-account-create-update-8227q" Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.163788 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df48470e-0356-46bc-a02e-f8dcbccc903c-operator-scripts\") pod \"root-account-create-update-8227q\" (UID: \"df48470e-0356-46bc-a02e-f8dcbccc903c\") " pod="openstack/root-account-create-update-8227q" Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.195596 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkmzq\" (UniqueName: \"kubernetes.io/projected/df48470e-0356-46bc-a02e-f8dcbccc903c-kube-api-access-zkmzq\") pod \"root-account-create-update-8227q\" (UID: \"df48470e-0356-46bc-a02e-f8dcbccc903c\") " pod="openstack/root-account-create-update-8227q" Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.384057 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8227q" Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.436897 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.436990 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.485100 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552712-scffb" event={"ID":"486a1f15-4abd-46a9-9ce1-aa79e0e3cc94","Type":"ContainerStarted","Data":"2e87feb5bb94a273d1b2fd164e5ca9498f3572f6b9d6a273640e4357dc93dcf6"} Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.530432 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.595850 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.769166 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pj4x8"] Mar 10 17:12:01 crc kubenswrapper[4749]: I0310 17:12:01.862498 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8227q"] Mar 10 17:12:02 crc kubenswrapper[4749]: I0310 17:12:02.498447 4749 generic.go:334] "Generic (PLEG): container finished" podID="df48470e-0356-46bc-a02e-f8dcbccc903c" containerID="4d408357a8dbec6ca56306ddafe1831fb6ef34697be703f74a01c6a2bca06762" exitCode=0 Mar 10 17:12:02 crc kubenswrapper[4749]: I0310 17:12:02.498525 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8227q" 
event={"ID":"df48470e-0356-46bc-a02e-f8dcbccc903c","Type":"ContainerDied","Data":"4d408357a8dbec6ca56306ddafe1831fb6ef34697be703f74a01c6a2bca06762"} Mar 10 17:12:02 crc kubenswrapper[4749]: I0310 17:12:02.498849 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8227q" event={"ID":"df48470e-0356-46bc-a02e-f8dcbccc903c","Type":"ContainerStarted","Data":"8b60a9a05e6b9ffb1b12ad63b867993116e02298e8d268574f87a050d5729a9d"} Mar 10 17:12:02 crc kubenswrapper[4749]: I0310 17:12:02.500658 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552712-scffb" event={"ID":"486a1f15-4abd-46a9-9ce1-aa79e0e3cc94","Type":"ContainerStarted","Data":"d59309f9557305938549ce3caa72d348360e893c84dcafc3bc235fb065b736bb"} Mar 10 17:12:02 crc kubenswrapper[4749]: I0310 17:12:02.545801 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552712-scffb" podStartSLOduration=1.486485568 podStartE2EDuration="2.545783217s" podCreationTimestamp="2026-03-10 17:12:00 +0000 UTC" firstStartedPulling="2026-03-10 17:12:01.011357509 +0000 UTC m=+5018.133223236" lastFinishedPulling="2026-03-10 17:12:02.070655198 +0000 UTC m=+5019.192520885" observedRunningTime="2026-03-10 17:12:02.535633411 +0000 UTC m=+5019.657499098" watchObservedRunningTime="2026-03-10 17:12:02.545783217 +0000 UTC m=+5019.667648904" Mar 10 17:12:03 crc kubenswrapper[4749]: I0310 17:12:03.528531 4749 generic.go:334] "Generic (PLEG): container finished" podID="486a1f15-4abd-46a9-9ce1-aa79e0e3cc94" containerID="d59309f9557305938549ce3caa72d348360e893c84dcafc3bc235fb065b736bb" exitCode=0 Mar 10 17:12:03 crc kubenswrapper[4749]: I0310 17:12:03.529053 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552712-scffb" 
event={"ID":"486a1f15-4abd-46a9-9ce1-aa79e0e3cc94","Type":"ContainerDied","Data":"d59309f9557305938549ce3caa72d348360e893c84dcafc3bc235fb065b736bb"} Mar 10 17:12:03 crc kubenswrapper[4749]: I0310 17:12:03.529364 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pj4x8" podUID="586828c3-61e9-405d-b005-0b0b265ea71e" containerName="registry-server" containerID="cri-o://6140f41495fffd924c5a153f205a8ca51d3780e8286ca13c28c86bee8b475948" gracePeriod=2 Mar 10 17:12:03 crc kubenswrapper[4749]: I0310 17:12:03.982802 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-8227q" Mar 10 17:12:03 crc kubenswrapper[4749]: I0310 17:12:03.991340 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.110221 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586828c3-61e9-405d-b005-0b0b265ea71e-utilities\") pod \"586828c3-61e9-405d-b005-0b0b265ea71e\" (UID: \"586828c3-61e9-405d-b005-0b0b265ea71e\") " Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.110281 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586828c3-61e9-405d-b005-0b0b265ea71e-catalog-content\") pod \"586828c3-61e9-405d-b005-0b0b265ea71e\" (UID: \"586828c3-61e9-405d-b005-0b0b265ea71e\") " Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.110441 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df48470e-0356-46bc-a02e-f8dcbccc903c-operator-scripts\") pod \"df48470e-0356-46bc-a02e-f8dcbccc903c\" (UID: \"df48470e-0356-46bc-a02e-f8dcbccc903c\") " Mar 10 17:12:04 crc 
kubenswrapper[4749]: I0310 17:12:04.110488 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxbbn\" (UniqueName: \"kubernetes.io/projected/586828c3-61e9-405d-b005-0b0b265ea71e-kube-api-access-kxbbn\") pod \"586828c3-61e9-405d-b005-0b0b265ea71e\" (UID: \"586828c3-61e9-405d-b005-0b0b265ea71e\") " Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.110669 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkmzq\" (UniqueName: \"kubernetes.io/projected/df48470e-0356-46bc-a02e-f8dcbccc903c-kube-api-access-zkmzq\") pod \"df48470e-0356-46bc-a02e-f8dcbccc903c\" (UID: \"df48470e-0356-46bc-a02e-f8dcbccc903c\") " Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.111650 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df48470e-0356-46bc-a02e-f8dcbccc903c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df48470e-0356-46bc-a02e-f8dcbccc903c" (UID: "df48470e-0356-46bc-a02e-f8dcbccc903c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.113302 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/586828c3-61e9-405d-b005-0b0b265ea71e-utilities" (OuterVolumeSpecName: "utilities") pod "586828c3-61e9-405d-b005-0b0b265ea71e" (UID: "586828c3-61e9-405d-b005-0b0b265ea71e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.115835 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df48470e-0356-46bc-a02e-f8dcbccc903c-kube-api-access-zkmzq" (OuterVolumeSpecName: "kube-api-access-zkmzq") pod "df48470e-0356-46bc-a02e-f8dcbccc903c" (UID: "df48470e-0356-46bc-a02e-f8dcbccc903c"). InnerVolumeSpecName "kube-api-access-zkmzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.117909 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586828c3-61e9-405d-b005-0b0b265ea71e-kube-api-access-kxbbn" (OuterVolumeSpecName: "kube-api-access-kxbbn") pod "586828c3-61e9-405d-b005-0b0b265ea71e" (UID: "586828c3-61e9-405d-b005-0b0b265ea71e"). InnerVolumeSpecName "kube-api-access-kxbbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.212663 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxbbn\" (UniqueName: \"kubernetes.io/projected/586828c3-61e9-405d-b005-0b0b265ea71e-kube-api-access-kxbbn\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.212704 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkmzq\" (UniqueName: \"kubernetes.io/projected/df48470e-0356-46bc-a02e-f8dcbccc903c-kube-api-access-zkmzq\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.212718 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586828c3-61e9-405d-b005-0b0b265ea71e-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.212730 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df48470e-0356-46bc-a02e-f8dcbccc903c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.364252 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/586828c3-61e9-405d-b005-0b0b265ea71e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "586828c3-61e9-405d-b005-0b0b265ea71e" (UID: "586828c3-61e9-405d-b005-0b0b265ea71e"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.416112 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586828c3-61e9-405d-b005-0b0b265ea71e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.552669 4749 generic.go:334] "Generic (PLEG): container finished" podID="586828c3-61e9-405d-b005-0b0b265ea71e" containerID="6140f41495fffd924c5a153f205a8ca51d3780e8286ca13c28c86bee8b475948" exitCode=0 Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.552781 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pj4x8" event={"ID":"586828c3-61e9-405d-b005-0b0b265ea71e","Type":"ContainerDied","Data":"6140f41495fffd924c5a153f205a8ca51d3780e8286ca13c28c86bee8b475948"} Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.552826 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pj4x8" event={"ID":"586828c3-61e9-405d-b005-0b0b265ea71e","Type":"ContainerDied","Data":"49a441b974f5b41ce9e6b9c94c1707192df92b5568b6f2c87d0d27d3e99c5e76"} Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.552856 4749 scope.go:117] "RemoveContainer" containerID="6140f41495fffd924c5a153f205a8ca51d3780e8286ca13c28c86bee8b475948" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.553032 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pj4x8" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.562722 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8227q" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.562790 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8227q" event={"ID":"df48470e-0356-46bc-a02e-f8dcbccc903c","Type":"ContainerDied","Data":"8b60a9a05e6b9ffb1b12ad63b867993116e02298e8d268574f87a050d5729a9d"} Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.562832 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b60a9a05e6b9ffb1b12ad63b867993116e02298e8d268574f87a050d5729a9d" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.596634 4749 scope.go:117] "RemoveContainer" containerID="0b6133143ffeac60b1ea2efc0dd60c48e1b6aa83eac1a8c1344ba370446051f6" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.639283 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pj4x8"] Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.652038 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pj4x8"] Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.671669 4749 scope.go:117] "RemoveContainer" containerID="deec1da9d05f1c826c9d6ca7d512c32afa6ca32ee6f6213ab18e9596f5b69e4b" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.739749 4749 scope.go:117] "RemoveContainer" containerID="6140f41495fffd924c5a153f205a8ca51d3780e8286ca13c28c86bee8b475948" Mar 10 17:12:04 crc kubenswrapper[4749]: E0310 17:12:04.740235 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6140f41495fffd924c5a153f205a8ca51d3780e8286ca13c28c86bee8b475948\": container with ID starting with 6140f41495fffd924c5a153f205a8ca51d3780e8286ca13c28c86bee8b475948 not found: ID does not exist" containerID="6140f41495fffd924c5a153f205a8ca51d3780e8286ca13c28c86bee8b475948" Mar 10 17:12:04 crc kubenswrapper[4749]: 
I0310 17:12:04.740281 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6140f41495fffd924c5a153f205a8ca51d3780e8286ca13c28c86bee8b475948"} err="failed to get container status \"6140f41495fffd924c5a153f205a8ca51d3780e8286ca13c28c86bee8b475948\": rpc error: code = NotFound desc = could not find container \"6140f41495fffd924c5a153f205a8ca51d3780e8286ca13c28c86bee8b475948\": container with ID starting with 6140f41495fffd924c5a153f205a8ca51d3780e8286ca13c28c86bee8b475948 not found: ID does not exist" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.740308 4749 scope.go:117] "RemoveContainer" containerID="0b6133143ffeac60b1ea2efc0dd60c48e1b6aa83eac1a8c1344ba370446051f6" Mar 10 17:12:04 crc kubenswrapper[4749]: E0310 17:12:04.740601 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b6133143ffeac60b1ea2efc0dd60c48e1b6aa83eac1a8c1344ba370446051f6\": container with ID starting with 0b6133143ffeac60b1ea2efc0dd60c48e1b6aa83eac1a8c1344ba370446051f6 not found: ID does not exist" containerID="0b6133143ffeac60b1ea2efc0dd60c48e1b6aa83eac1a8c1344ba370446051f6" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.740626 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6133143ffeac60b1ea2efc0dd60c48e1b6aa83eac1a8c1344ba370446051f6"} err="failed to get container status \"0b6133143ffeac60b1ea2efc0dd60c48e1b6aa83eac1a8c1344ba370446051f6\": rpc error: code = NotFound desc = could not find container \"0b6133143ffeac60b1ea2efc0dd60c48e1b6aa83eac1a8c1344ba370446051f6\": container with ID starting with 0b6133143ffeac60b1ea2efc0dd60c48e1b6aa83eac1a8c1344ba370446051f6 not found: ID does not exist" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.740640 4749 scope.go:117] "RemoveContainer" containerID="deec1da9d05f1c826c9d6ca7d512c32afa6ca32ee6f6213ab18e9596f5b69e4b" Mar 10 17:12:04 crc 
kubenswrapper[4749]: E0310 17:12:04.741465 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deec1da9d05f1c826c9d6ca7d512c32afa6ca32ee6f6213ab18e9596f5b69e4b\": container with ID starting with deec1da9d05f1c826c9d6ca7d512c32afa6ca32ee6f6213ab18e9596f5b69e4b not found: ID does not exist" containerID="deec1da9d05f1c826c9d6ca7d512c32afa6ca32ee6f6213ab18e9596f5b69e4b" Mar 10 17:12:04 crc kubenswrapper[4749]: I0310 17:12:04.741504 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deec1da9d05f1c826c9d6ca7d512c32afa6ca32ee6f6213ab18e9596f5b69e4b"} err="failed to get container status \"deec1da9d05f1c826c9d6ca7d512c32afa6ca32ee6f6213ab18e9596f5b69e4b\": rpc error: code = NotFound desc = could not find container \"deec1da9d05f1c826c9d6ca7d512c32afa6ca32ee6f6213ab18e9596f5b69e4b\": container with ID starting with deec1da9d05f1c826c9d6ca7d512c32afa6ca32ee6f6213ab18e9596f5b69e4b not found: ID does not exist" Mar 10 17:12:05 crc kubenswrapper[4749]: I0310 17:12:05.018531 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552712-scffb" Mar 10 17:12:05 crc kubenswrapper[4749]: I0310 17:12:05.130512 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nfx6\" (UniqueName: \"kubernetes.io/projected/486a1f15-4abd-46a9-9ce1-aa79e0e3cc94-kube-api-access-8nfx6\") pod \"486a1f15-4abd-46a9-9ce1-aa79e0e3cc94\" (UID: \"486a1f15-4abd-46a9-9ce1-aa79e0e3cc94\") " Mar 10 17:12:05 crc kubenswrapper[4749]: I0310 17:12:05.138162 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486a1f15-4abd-46a9-9ce1-aa79e0e3cc94-kube-api-access-8nfx6" (OuterVolumeSpecName: "kube-api-access-8nfx6") pod "486a1f15-4abd-46a9-9ce1-aa79e0e3cc94" (UID: "486a1f15-4abd-46a9-9ce1-aa79e0e3cc94"). 
InnerVolumeSpecName "kube-api-access-8nfx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:12:05 crc kubenswrapper[4749]: I0310 17:12:05.232464 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nfx6\" (UniqueName: \"kubernetes.io/projected/486a1f15-4abd-46a9-9ce1-aa79e0e3cc94-kube-api-access-8nfx6\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:05 crc kubenswrapper[4749]: I0310 17:12:05.578713 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552712-scffb" event={"ID":"486a1f15-4abd-46a9-9ce1-aa79e0e3cc94","Type":"ContainerDied","Data":"2e87feb5bb94a273d1b2fd164e5ca9498f3572f6b9d6a273640e4357dc93dcf6"} Mar 10 17:12:05 crc kubenswrapper[4749]: I0310 17:12:05.579179 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e87feb5bb94a273d1b2fd164e5ca9498f3572f6b9d6a273640e4357dc93dcf6" Mar 10 17:12:05 crc kubenswrapper[4749]: I0310 17:12:05.578800 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552712-scffb" Mar 10 17:12:05 crc kubenswrapper[4749]: I0310 17:12:05.621704 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="586828c3-61e9-405d-b005-0b0b265ea71e" path="/var/lib/kubelet/pods/586828c3-61e9-405d-b005-0b0b265ea71e/volumes" Mar 10 17:12:05 crc kubenswrapper[4749]: I0310 17:12:05.625188 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552706-m9rpd"] Mar 10 17:12:05 crc kubenswrapper[4749]: I0310 17:12:05.632149 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552706-m9rpd"] Mar 10 17:12:07 crc kubenswrapper[4749]: I0310 17:12:07.410754 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-8227q"] Mar 10 17:12:07 crc kubenswrapper[4749]: I0310 17:12:07.417836 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-8227q"] Mar 10 17:12:07 crc kubenswrapper[4749]: I0310 17:12:07.620416 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c25b49-4263-4673-b2eb-0b423da84861" path="/var/lib/kubelet/pods/70c25b49-4263-4673-b2eb-0b423da84861/volumes" Mar 10 17:12:07 crc kubenswrapper[4749]: I0310 17:12:07.621686 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df48470e-0356-46bc-a02e-f8dcbccc903c" path="/var/lib/kubelet/pods/df48470e-0356-46bc-a02e-f8dcbccc903c/volumes" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.427907 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7cq8f"] Mar 10 17:12:12 crc kubenswrapper[4749]: E0310 17:12:12.428595 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586828c3-61e9-405d-b005-0b0b265ea71e" containerName="registry-server" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.428614 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="586828c3-61e9-405d-b005-0b0b265ea71e" containerName="registry-server" Mar 10 17:12:12 crc kubenswrapper[4749]: E0310 17:12:12.428635 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586828c3-61e9-405d-b005-0b0b265ea71e" containerName="extract-utilities" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.428646 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="586828c3-61e9-405d-b005-0b0b265ea71e" containerName="extract-utilities" Mar 10 17:12:12 crc kubenswrapper[4749]: E0310 17:12:12.428664 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486a1f15-4abd-46a9-9ce1-aa79e0e3cc94" containerName="oc" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.428672 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="486a1f15-4abd-46a9-9ce1-aa79e0e3cc94" containerName="oc" Mar 10 17:12:12 crc kubenswrapper[4749]: E0310 17:12:12.428684 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df48470e-0356-46bc-a02e-f8dcbccc903c" containerName="mariadb-account-create-update" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.428692 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="df48470e-0356-46bc-a02e-f8dcbccc903c" containerName="mariadb-account-create-update" Mar 10 17:12:12 crc kubenswrapper[4749]: E0310 17:12:12.428714 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586828c3-61e9-405d-b005-0b0b265ea71e" containerName="extract-content" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.428722 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="586828c3-61e9-405d-b005-0b0b265ea71e" containerName="extract-content" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.428899 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="586828c3-61e9-405d-b005-0b0b265ea71e" containerName="registry-server" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.428922 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="df48470e-0356-46bc-a02e-f8dcbccc903c" containerName="mariadb-account-create-update" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.428936 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="486a1f15-4abd-46a9-9ce1-aa79e0e3cc94" containerName="oc" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.429579 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7cq8f" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.434076 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.445849 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7cq8f"] Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.551346 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n6x9\" (UniqueName: \"kubernetes.io/projected/d7a2202b-2b1d-4d85-909e-004bd8fa1d66-kube-api-access-6n6x9\") pod \"root-account-create-update-7cq8f\" (UID: \"d7a2202b-2b1d-4d85-909e-004bd8fa1d66\") " pod="openstack/root-account-create-update-7cq8f" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.551396 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7a2202b-2b1d-4d85-909e-004bd8fa1d66-operator-scripts\") pod \"root-account-create-update-7cq8f\" (UID: \"d7a2202b-2b1d-4d85-909e-004bd8fa1d66\") " pod="openstack/root-account-create-update-7cq8f" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.654555 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n6x9\" (UniqueName: \"kubernetes.io/projected/d7a2202b-2b1d-4d85-909e-004bd8fa1d66-kube-api-access-6n6x9\") pod \"root-account-create-update-7cq8f\" (UID: 
\"d7a2202b-2b1d-4d85-909e-004bd8fa1d66\") " pod="openstack/root-account-create-update-7cq8f" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.654651 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7a2202b-2b1d-4d85-909e-004bd8fa1d66-operator-scripts\") pod \"root-account-create-update-7cq8f\" (UID: \"d7a2202b-2b1d-4d85-909e-004bd8fa1d66\") " pod="openstack/root-account-create-update-7cq8f" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.655986 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7a2202b-2b1d-4d85-909e-004bd8fa1d66-operator-scripts\") pod \"root-account-create-update-7cq8f\" (UID: \"d7a2202b-2b1d-4d85-909e-004bd8fa1d66\") " pod="openstack/root-account-create-update-7cq8f" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.687612 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n6x9\" (UniqueName: \"kubernetes.io/projected/d7a2202b-2b1d-4d85-909e-004bd8fa1d66-kube-api-access-6n6x9\") pod \"root-account-create-update-7cq8f\" (UID: \"d7a2202b-2b1d-4d85-909e-004bd8fa1d66\") " pod="openstack/root-account-create-update-7cq8f" Mar 10 17:12:12 crc kubenswrapper[4749]: I0310 17:12:12.750419 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7cq8f" Mar 10 17:12:13 crc kubenswrapper[4749]: I0310 17:12:13.181256 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7cq8f"] Mar 10 17:12:13 crc kubenswrapper[4749]: I0310 17:12:13.654740 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7cq8f" event={"ID":"d7a2202b-2b1d-4d85-909e-004bd8fa1d66","Type":"ContainerStarted","Data":"a627c7d51a6873e42fa60a483d9edf9b88fbd5393208fe23619119016e253dc0"} Mar 10 17:12:13 crc kubenswrapper[4749]: I0310 17:12:13.655092 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7cq8f" event={"ID":"d7a2202b-2b1d-4d85-909e-004bd8fa1d66","Type":"ContainerStarted","Data":"f912b9b61e5abeaf53a5d60ba320cc1aea6dbe58f2b8a0cf4f837af2f186474b"} Mar 10 17:12:13 crc kubenswrapper[4749]: I0310 17:12:13.679543 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-7cq8f" podStartSLOduration=1.679523747 podStartE2EDuration="1.679523747s" podCreationTimestamp="2026-03-10 17:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:12:13.677114782 +0000 UTC m=+5030.798980469" watchObservedRunningTime="2026-03-10 17:12:13.679523747 +0000 UTC m=+5030.801389434" Mar 10 17:12:14 crc kubenswrapper[4749]: I0310 17:12:14.666810 4749 generic.go:334] "Generic (PLEG): container finished" podID="462db90c-a38b-45ca-9b68-6b7178e52fbe" containerID="bb68ff55f2f183f37f93f85f2733f28e4393f656c9e0a3a88a953e71d63e3d10" exitCode=0 Mar 10 17:12:14 crc kubenswrapper[4749]: I0310 17:12:14.666932 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"462db90c-a38b-45ca-9b68-6b7178e52fbe","Type":"ContainerDied","Data":"bb68ff55f2f183f37f93f85f2733f28e4393f656c9e0a3a88a953e71d63e3d10"} Mar 10 17:12:14 crc kubenswrapper[4749]: I0310 17:12:14.672110 4749 generic.go:334] "Generic (PLEG): container finished" podID="d7a2202b-2b1d-4d85-909e-004bd8fa1d66" containerID="a627c7d51a6873e42fa60a483d9edf9b88fbd5393208fe23619119016e253dc0" exitCode=0 Mar 10 17:12:14 crc kubenswrapper[4749]: I0310 17:12:14.672168 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7cq8f" event={"ID":"d7a2202b-2b1d-4d85-909e-004bd8fa1d66","Type":"ContainerDied","Data":"a627c7d51a6873e42fa60a483d9edf9b88fbd5393208fe23619119016e253dc0"} Mar 10 17:12:14 crc kubenswrapper[4749]: E0310 17:12:14.980513 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e98f3ae_2fbb_4d5d_aac7_df411f2df1a4.slice/crio-f42e112e31917c1371ca11a9cfd21d2142663fd32ad7edfc5037e72d11dfef7e.scope\": RecentStats: unable to find data in memory cache]" Mar 10 17:12:15 crc kubenswrapper[4749]: I0310 17:12:15.682726 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"462db90c-a38b-45ca-9b68-6b7178e52fbe","Type":"ContainerStarted","Data":"87991de259fc4828f435aa8e9ebb80d28580911161bc7b5aad2aad624e123ddc"} Mar 10 17:12:15 crc kubenswrapper[4749]: I0310 17:12:15.683322 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 17:12:15 crc kubenswrapper[4749]: I0310 17:12:15.685040 4749 generic.go:334] "Generic (PLEG): container finished" podID="7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" containerID="f42e112e31917c1371ca11a9cfd21d2142663fd32ad7edfc5037e72d11dfef7e" exitCode=0 Mar 10 17:12:15 crc kubenswrapper[4749]: I0310 17:12:15.685112 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4","Type":"ContainerDied","Data":"f42e112e31917c1371ca11a9cfd21d2142663fd32ad7edfc5037e72d11dfef7e"} Mar 10 17:12:15 crc kubenswrapper[4749]: I0310 17:12:15.723332 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.723309178 podStartE2EDuration="36.723309178s" podCreationTimestamp="2026-03-10 17:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:12:15.71787371 +0000 UTC m=+5032.839739417" watchObservedRunningTime="2026-03-10 17:12:15.723309178 +0000 UTC m=+5032.845174865" Mar 10 17:12:16 crc kubenswrapper[4749]: I0310 17:12:16.061073 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7cq8f" Mar 10 17:12:16 crc kubenswrapper[4749]: I0310 17:12:16.118312 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7a2202b-2b1d-4d85-909e-004bd8fa1d66-operator-scripts\") pod \"d7a2202b-2b1d-4d85-909e-004bd8fa1d66\" (UID: \"d7a2202b-2b1d-4d85-909e-004bd8fa1d66\") " Mar 10 17:12:16 crc kubenswrapper[4749]: I0310 17:12:16.118643 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n6x9\" (UniqueName: \"kubernetes.io/projected/d7a2202b-2b1d-4d85-909e-004bd8fa1d66-kube-api-access-6n6x9\") pod \"d7a2202b-2b1d-4d85-909e-004bd8fa1d66\" (UID: \"d7a2202b-2b1d-4d85-909e-004bd8fa1d66\") " Mar 10 17:12:16 crc kubenswrapper[4749]: I0310 17:12:16.119282 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a2202b-2b1d-4d85-909e-004bd8fa1d66-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7a2202b-2b1d-4d85-909e-004bd8fa1d66" (UID: 
"d7a2202b-2b1d-4d85-909e-004bd8fa1d66"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:12:16 crc kubenswrapper[4749]: I0310 17:12:16.123635 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7a2202b-2b1d-4d85-909e-004bd8fa1d66-kube-api-access-6n6x9" (OuterVolumeSpecName: "kube-api-access-6n6x9") pod "d7a2202b-2b1d-4d85-909e-004bd8fa1d66" (UID: "d7a2202b-2b1d-4d85-909e-004bd8fa1d66"). InnerVolumeSpecName "kube-api-access-6n6x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:12:16 crc kubenswrapper[4749]: I0310 17:12:16.220725 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7a2202b-2b1d-4d85-909e-004bd8fa1d66-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:16 crc kubenswrapper[4749]: I0310 17:12:16.220780 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n6x9\" (UniqueName: \"kubernetes.io/projected/d7a2202b-2b1d-4d85-909e-004bd8fa1d66-kube-api-access-6n6x9\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:16 crc kubenswrapper[4749]: I0310 17:12:16.692581 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4","Type":"ContainerStarted","Data":"e82db328c64aa013b9b1e20bf35df3434346e7a1816af2bdde1236fd13cd3a0b"} Mar 10 17:12:16 crc kubenswrapper[4749]: I0310 17:12:16.693607 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:16 crc kubenswrapper[4749]: I0310 17:12:16.695734 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7cq8f" Mar 10 17:12:16 crc kubenswrapper[4749]: I0310 17:12:16.696554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7cq8f" event={"ID":"d7a2202b-2b1d-4d85-909e-004bd8fa1d66","Type":"ContainerDied","Data":"f912b9b61e5abeaf53a5d60ba320cc1aea6dbe58f2b8a0cf4f837af2f186474b"} Mar 10 17:12:16 crc kubenswrapper[4749]: I0310 17:12:16.696616 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f912b9b61e5abeaf53a5d60ba320cc1aea6dbe58f2b8a0cf4f837af2f186474b" Mar 10 17:12:16 crc kubenswrapper[4749]: I0310 17:12:16.731377 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.731349772 podStartE2EDuration="37.731349772s" podCreationTimestamp="2026-03-10 17:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:12:16.72615637 +0000 UTC m=+5033.848022057" watchObservedRunningTime="2026-03-10 17:12:16.731349772 +0000 UTC m=+5033.853215459" Mar 10 17:12:26 crc kubenswrapper[4749]: I0310 17:12:26.933502 4749 scope.go:117] "RemoveContainer" containerID="8549f4074d1a6454cd0c68a1c87f4942bba984a093c352256b5e85c28908bff0" Mar 10 17:12:31 crc kubenswrapper[4749]: I0310 17:12:31.044697 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 17:12:31 crc kubenswrapper[4749]: I0310 17:12:31.289563 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:39 crc kubenswrapper[4749]: I0310 17:12:39.672416 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-cq62w"] Mar 10 17:12:39 crc kubenswrapper[4749]: E0310 17:12:39.673068 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d7a2202b-2b1d-4d85-909e-004bd8fa1d66" containerName="mariadb-account-create-update" Mar 10 17:12:39 crc kubenswrapper[4749]: I0310 17:12:39.673080 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a2202b-2b1d-4d85-909e-004bd8fa1d66" containerName="mariadb-account-create-update" Mar 10 17:12:39 crc kubenswrapper[4749]: I0310 17:12:39.673206 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a2202b-2b1d-4d85-909e-004bd8fa1d66" containerName="mariadb-account-create-update" Mar 10 17:12:39 crc kubenswrapper[4749]: I0310 17:12:39.673937 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" Mar 10 17:12:39 crc kubenswrapper[4749]: I0310 17:12:39.688545 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-cq62w"] Mar 10 17:12:39 crc kubenswrapper[4749]: I0310 17:12:39.826044 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2nnd\" (UniqueName: \"kubernetes.io/projected/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-kube-api-access-x2nnd\") pod \"dnsmasq-dns-66d5bf7c87-cq62w\" (UID: \"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\") " pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" Mar 10 17:12:39 crc kubenswrapper[4749]: I0310 17:12:39.826178 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-cq62w\" (UID: \"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\") " pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" Mar 10 17:12:39 crc kubenswrapper[4749]: I0310 17:12:39.826520 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-config\") pod \"dnsmasq-dns-66d5bf7c87-cq62w\" (UID: 
\"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\") " pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" Mar 10 17:12:39 crc kubenswrapper[4749]: I0310 17:12:39.928179 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-config\") pod \"dnsmasq-dns-66d5bf7c87-cq62w\" (UID: \"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\") " pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" Mar 10 17:12:39 crc kubenswrapper[4749]: I0310 17:12:39.928285 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2nnd\" (UniqueName: \"kubernetes.io/projected/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-kube-api-access-x2nnd\") pod \"dnsmasq-dns-66d5bf7c87-cq62w\" (UID: \"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\") " pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" Mar 10 17:12:39 crc kubenswrapper[4749]: I0310 17:12:39.928334 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-cq62w\" (UID: \"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\") " pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" Mar 10 17:12:39 crc kubenswrapper[4749]: I0310 17:12:39.929219 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-config\") pod \"dnsmasq-dns-66d5bf7c87-cq62w\" (UID: \"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\") " pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" Mar 10 17:12:39 crc kubenswrapper[4749]: I0310 17:12:39.929417 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-cq62w\" (UID: \"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\") " pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" Mar 10 17:12:39 crc 
kubenswrapper[4749]: I0310 17:12:39.950858 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2nnd\" (UniqueName: \"kubernetes.io/projected/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-kube-api-access-x2nnd\") pod \"dnsmasq-dns-66d5bf7c87-cq62w\" (UID: \"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\") " pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" Mar 10 17:12:39 crc kubenswrapper[4749]: I0310 17:12:39.994439 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" Mar 10 17:12:40 crc kubenswrapper[4749]: W0310 17:12:40.471698 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cb77f2b_7d48_4ed4_9a9a_a6a28a5d071a.slice/crio-4d3e3a489fffeae0a1dfa4eaa965a1e679571f52d862891ef97a7ccc308e40fb WatchSource:0}: Error finding container 4d3e3a489fffeae0a1dfa4eaa965a1e679571f52d862891ef97a7ccc308e40fb: Status 404 returned error can't find the container with id 4d3e3a489fffeae0a1dfa4eaa965a1e679571f52d862891ef97a7ccc308e40fb Mar 10 17:12:40 crc kubenswrapper[4749]: I0310 17:12:40.473926 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-cq62w"] Mar 10 17:12:40 crc kubenswrapper[4749]: I0310 17:12:40.896917 4749 generic.go:334] "Generic (PLEG): container finished" podID="8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a" containerID="30105a8f299f98e4b204912135e1c1a3ef2427ee0f9ee89d9689d2ece0682dc1" exitCode=0 Mar 10 17:12:40 crc kubenswrapper[4749]: I0310 17:12:40.896970 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" event={"ID":"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a","Type":"ContainerDied","Data":"30105a8f299f98e4b204912135e1c1a3ef2427ee0f9ee89d9689d2ece0682dc1"} Mar 10 17:12:40 crc kubenswrapper[4749]: I0310 17:12:40.897300 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" 
event={"ID":"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a","Type":"ContainerStarted","Data":"4d3e3a489fffeae0a1dfa4eaa965a1e679571f52d862891ef97a7ccc308e40fb"} Mar 10 17:12:41 crc kubenswrapper[4749]: I0310 17:12:41.003392 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 17:12:41 crc kubenswrapper[4749]: I0310 17:12:41.119150 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 17:12:41 crc kubenswrapper[4749]: I0310 17:12:41.926794 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" event={"ID":"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a","Type":"ContainerStarted","Data":"0546bd27f19ac462616224245e563b5b87d44af898a5d52c230f762a9dcc0879"} Mar 10 17:12:41 crc kubenswrapper[4749]: I0310 17:12:41.927780 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" Mar 10 17:12:41 crc kubenswrapper[4749]: I0310 17:12:41.948906 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" podStartSLOduration=2.9488894500000002 podStartE2EDuration="2.94888945s" podCreationTimestamp="2026-03-10 17:12:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:12:41.943679077 +0000 UTC m=+5059.065544764" watchObservedRunningTime="2026-03-10 17:12:41.94888945 +0000 UTC m=+5059.070755137" Mar 10 17:12:44 crc kubenswrapper[4749]: I0310 17:12:44.843860 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="462db90c-a38b-45ca-9b68-6b7178e52fbe" containerName="rabbitmq" containerID="cri-o://87991de259fc4828f435aa8e9ebb80d28580911161bc7b5aad2aad624e123ddc" gracePeriod=604797 Mar 10 17:12:45 crc kubenswrapper[4749]: I0310 17:12:45.328027 4749 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" containerName="rabbitmq" containerID="cri-o://e82db328c64aa013b9b1e20bf35df3434346e7a1816af2bdde1236fd13cd3a0b" gracePeriod=604796 Mar 10 17:12:49 crc kubenswrapper[4749]: I0310 17:12:49.996706 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.118473 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-bc4l8"] Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.118758 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" podUID="fda915ab-2411-484d-9fa8-7b80374f90cf" containerName="dnsmasq-dns" containerID="cri-o://7d4297d3714ba5af5eb7733e68eaec460b0364dbb956b0ea5f4cf7ee1e95ba41" gracePeriod=10 Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.180799 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" podUID="fda915ab-2411-484d-9fa8-7b80374f90cf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.20:5353: connect: connection refused" Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.621930 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.812629 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-748ws\" (UniqueName: \"kubernetes.io/projected/fda915ab-2411-484d-9fa8-7b80374f90cf-kube-api-access-748ws\") pod \"fda915ab-2411-484d-9fa8-7b80374f90cf\" (UID: \"fda915ab-2411-484d-9fa8-7b80374f90cf\") " Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.813094 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fda915ab-2411-484d-9fa8-7b80374f90cf-dns-svc\") pod \"fda915ab-2411-484d-9fa8-7b80374f90cf\" (UID: \"fda915ab-2411-484d-9fa8-7b80374f90cf\") " Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.813214 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fda915ab-2411-484d-9fa8-7b80374f90cf-config\") pod \"fda915ab-2411-484d-9fa8-7b80374f90cf\" (UID: \"fda915ab-2411-484d-9fa8-7b80374f90cf\") " Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.818224 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda915ab-2411-484d-9fa8-7b80374f90cf-kube-api-access-748ws" (OuterVolumeSpecName: "kube-api-access-748ws") pod "fda915ab-2411-484d-9fa8-7b80374f90cf" (UID: "fda915ab-2411-484d-9fa8-7b80374f90cf"). InnerVolumeSpecName "kube-api-access-748ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.862922 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda915ab-2411-484d-9fa8-7b80374f90cf-config" (OuterVolumeSpecName: "config") pod "fda915ab-2411-484d-9fa8-7b80374f90cf" (UID: "fda915ab-2411-484d-9fa8-7b80374f90cf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.868547 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda915ab-2411-484d-9fa8-7b80374f90cf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fda915ab-2411-484d-9fa8-7b80374f90cf" (UID: "fda915ab-2411-484d-9fa8-7b80374f90cf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.915232 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fda915ab-2411-484d-9fa8-7b80374f90cf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.915303 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fda915ab-2411-484d-9fa8-7b80374f90cf-config\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.915318 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-748ws\" (UniqueName: \"kubernetes.io/projected/fda915ab-2411-484d-9fa8-7b80374f90cf-kube-api-access-748ws\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.991911 4749 generic.go:334] "Generic (PLEG): container finished" podID="fda915ab-2411-484d-9fa8-7b80374f90cf" containerID="7d4297d3714ba5af5eb7733e68eaec460b0364dbb956b0ea5f4cf7ee1e95ba41" exitCode=0 Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.991963 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" event={"ID":"fda915ab-2411-484d-9fa8-7b80374f90cf","Type":"ContainerDied","Data":"7d4297d3714ba5af5eb7733e68eaec460b0364dbb956b0ea5f4cf7ee1e95ba41"} Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.991994 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" 
event={"ID":"fda915ab-2411-484d-9fa8-7b80374f90cf","Type":"ContainerDied","Data":"64d5c5b88c650c057437811b818765f13b9a0ceaf51058fe6d61e84b3bb6e39a"} Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.992014 4749 scope.go:117] "RemoveContainer" containerID="7d4297d3714ba5af5eb7733e68eaec460b0364dbb956b0ea5f4cf7ee1e95ba41" Mar 10 17:12:50 crc kubenswrapper[4749]: I0310 17:12:50.992466 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-bc4l8" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.041681 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="462db90c-a38b-45ca-9b68-6b7178e52fbe" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.21:5671: connect: connection refused" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.053205 4749 scope.go:117] "RemoveContainer" containerID="9d851fb97c5fb49053c3c13df53084049cf72fd891d2984de8d27c99f0224392" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.068967 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-bc4l8"] Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.080239 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-bc4l8"] Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.131189 4749 scope.go:117] "RemoveContainer" containerID="7d4297d3714ba5af5eb7733e68eaec460b0364dbb956b0ea5f4cf7ee1e95ba41" Mar 10 17:12:51 crc kubenswrapper[4749]: E0310 17:12:51.131724 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d4297d3714ba5af5eb7733e68eaec460b0364dbb956b0ea5f4cf7ee1e95ba41\": container with ID starting with 7d4297d3714ba5af5eb7733e68eaec460b0364dbb956b0ea5f4cf7ee1e95ba41 not found: ID does not exist" containerID="7d4297d3714ba5af5eb7733e68eaec460b0364dbb956b0ea5f4cf7ee1e95ba41" Mar 10 17:12:51 crc 
kubenswrapper[4749]: I0310 17:12:51.131749 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d4297d3714ba5af5eb7733e68eaec460b0364dbb956b0ea5f4cf7ee1e95ba41"} err="failed to get container status \"7d4297d3714ba5af5eb7733e68eaec460b0364dbb956b0ea5f4cf7ee1e95ba41\": rpc error: code = NotFound desc = could not find container \"7d4297d3714ba5af5eb7733e68eaec460b0364dbb956b0ea5f4cf7ee1e95ba41\": container with ID starting with 7d4297d3714ba5af5eb7733e68eaec460b0364dbb956b0ea5f4cf7ee1e95ba41 not found: ID does not exist" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.131770 4749 scope.go:117] "RemoveContainer" containerID="9d851fb97c5fb49053c3c13df53084049cf72fd891d2984de8d27c99f0224392" Mar 10 17:12:51 crc kubenswrapper[4749]: E0310 17:12:51.132048 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d851fb97c5fb49053c3c13df53084049cf72fd891d2984de8d27c99f0224392\": container with ID starting with 9d851fb97c5fb49053c3c13df53084049cf72fd891d2984de8d27c99f0224392 not found: ID does not exist" containerID="9d851fb97c5fb49053c3c13df53084049cf72fd891d2984de8d27c99f0224392" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.132066 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d851fb97c5fb49053c3c13df53084049cf72fd891d2984de8d27c99f0224392"} err="failed to get container status \"9d851fb97c5fb49053c3c13df53084049cf72fd891d2984de8d27c99f0224392\": rpc error: code = NotFound desc = could not find container \"9d851fb97c5fb49053c3c13df53084049cf72fd891d2984de8d27c99f0224392\": container with ID starting with 9d851fb97c5fb49053c3c13df53084049cf72fd891d2984de8d27c99f0224392 not found: ID does not exist" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.287821 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" 
podUID="7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.22:5671: connect: connection refused" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.414437 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.423133 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c534202a-35a1-4320-92ea-64818eedbbd0\") pod \"462db90c-a38b-45ca-9b68-6b7178e52fbe\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.423189 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-confd\") pod \"462db90c-a38b-45ca-9b68-6b7178e52fbe\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.423226 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-server-conf\") pod \"462db90c-a38b-45ca-9b68-6b7178e52fbe\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.423249 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77tmm\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-kube-api-access-77tmm\") pod \"462db90c-a38b-45ca-9b68-6b7178e52fbe\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.423270 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-config-data\") pod \"462db90c-a38b-45ca-9b68-6b7178e52fbe\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.423306 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-plugins\") pod \"462db90c-a38b-45ca-9b68-6b7178e52fbe\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.423346 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-erlang-cookie\") pod \"462db90c-a38b-45ca-9b68-6b7178e52fbe\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.423447 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-plugins-conf\") pod \"462db90c-a38b-45ca-9b68-6b7178e52fbe\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.423466 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-tls\") pod \"462db90c-a38b-45ca-9b68-6b7178e52fbe\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.423490 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/462db90c-a38b-45ca-9b68-6b7178e52fbe-erlang-cookie-secret\") pod \"462db90c-a38b-45ca-9b68-6b7178e52fbe\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " Mar 10 17:12:51 crc 
kubenswrapper[4749]: I0310 17:12:51.423520 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/462db90c-a38b-45ca-9b68-6b7178e52fbe-pod-info\") pod \"462db90c-a38b-45ca-9b68-6b7178e52fbe\" (UID: \"462db90c-a38b-45ca-9b68-6b7178e52fbe\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.424308 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "462db90c-a38b-45ca-9b68-6b7178e52fbe" (UID: "462db90c-a38b-45ca-9b68-6b7178e52fbe"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.424989 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "462db90c-a38b-45ca-9b68-6b7178e52fbe" (UID: "462db90c-a38b-45ca-9b68-6b7178e52fbe"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.425657 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "462db90c-a38b-45ca-9b68-6b7178e52fbe" (UID: "462db90c-a38b-45ca-9b68-6b7178e52fbe"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.426830 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/462db90c-a38b-45ca-9b68-6b7178e52fbe-pod-info" (OuterVolumeSpecName: "pod-info") pod "462db90c-a38b-45ca-9b68-6b7178e52fbe" (UID: "462db90c-a38b-45ca-9b68-6b7178e52fbe"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.427884 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "462db90c-a38b-45ca-9b68-6b7178e52fbe" (UID: "462db90c-a38b-45ca-9b68-6b7178e52fbe"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.429086 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/462db90c-a38b-45ca-9b68-6b7178e52fbe-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "462db90c-a38b-45ca-9b68-6b7178e52fbe" (UID: "462db90c-a38b-45ca-9b68-6b7178e52fbe"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.429714 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-kube-api-access-77tmm" (OuterVolumeSpecName: "kube-api-access-77tmm") pod "462db90c-a38b-45ca-9b68-6b7178e52fbe" (UID: "462db90c-a38b-45ca-9b68-6b7178e52fbe"). InnerVolumeSpecName "kube-api-access-77tmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.452506 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-config-data" (OuterVolumeSpecName: "config-data") pod "462db90c-a38b-45ca-9b68-6b7178e52fbe" (UID: "462db90c-a38b-45ca-9b68-6b7178e52fbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.452506 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c534202a-35a1-4320-92ea-64818eedbbd0" (OuterVolumeSpecName: "persistence") pod "462db90c-a38b-45ca-9b68-6b7178e52fbe" (UID: "462db90c-a38b-45ca-9b68-6b7178e52fbe"). InnerVolumeSpecName "pvc-c534202a-35a1-4320-92ea-64818eedbbd0". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.496341 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-server-conf" (OuterVolumeSpecName: "server-conf") pod "462db90c-a38b-45ca-9b68-6b7178e52fbe" (UID: "462db90c-a38b-45ca-9b68-6b7178e52fbe"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.524429 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c534202a-35a1-4320-92ea-64818eedbbd0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c534202a-35a1-4320-92ea-64818eedbbd0\") on node \"crc\" " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.524464 4749 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.524474 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77tmm\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-kube-api-access-77tmm\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.524483 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.524492 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.524502 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.524510 4749 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/462db90c-a38b-45ca-9b68-6b7178e52fbe-plugins-conf\") on node \"crc\" DevicePath 
\"\"" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.524518 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.524526 4749 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/462db90c-a38b-45ca-9b68-6b7178e52fbe-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.524534 4749 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/462db90c-a38b-45ca-9b68-6b7178e52fbe-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.557260 4749 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.557443 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c534202a-35a1-4320-92ea-64818eedbbd0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c534202a-35a1-4320-92ea-64818eedbbd0") on node "crc" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.565624 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "462db90c-a38b-45ca-9b68-6b7178e52fbe" (UID: "462db90c-a38b-45ca-9b68-6b7178e52fbe"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.615566 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda915ab-2411-484d-9fa8-7b80374f90cf" path="/var/lib/kubelet/pods/fda915ab-2411-484d-9fa8-7b80374f90cf/volumes" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.626118 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-c534202a-35a1-4320-92ea-64818eedbbd0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c534202a-35a1-4320-92ea-64818eedbbd0\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.626154 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/462db90c-a38b-45ca-9b68-6b7178e52fbe-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.819812 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.931291 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-tls\") pod \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.931348 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-config-data\") pod \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.931448 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6zxh\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-kube-api-access-s6zxh\") pod \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.931484 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-pod-info\") pod \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.931675 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\") pod \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.931705 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-erlang-cookie-secret\") pod \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.931729 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-confd\") pod \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.931769 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-erlang-cookie\") pod \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.931792 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-plugins\") pod \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.931871 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-plugins-conf\") pod \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.931943 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-server-conf\") pod \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\" (UID: \"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4\") " 
Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.932446 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" (UID: "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.932745 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" (UID: "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.932764 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" (UID: "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.935875 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-pod-info" (OuterVolumeSpecName: "pod-info") pod "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" (UID: "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.938424 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" (UID: "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.938627 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-kube-api-access-s6zxh" (OuterVolumeSpecName: "kube-api-access-s6zxh") pod "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" (UID: "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4"). InnerVolumeSpecName "kube-api-access-s6zxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.940149 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" (UID: "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.950691 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6" (OuterVolumeSpecName: "persistence") pod "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" (UID: "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4"). InnerVolumeSpecName "pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.960474 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-config-data" (OuterVolumeSpecName: "config-data") pod "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" (UID: "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:12:51 crc kubenswrapper[4749]: I0310 17:12:51.985599 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-server-conf" (OuterVolumeSpecName: "server-conf") pod "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" (UID: "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.001625 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" (UID: "7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.003503 4749 generic.go:334] "Generic (PLEG): container finished" podID="462db90c-a38b-45ca-9b68-6b7178e52fbe" containerID="87991de259fc4828f435aa8e9ebb80d28580911161bc7b5aad2aad624e123ddc" exitCode=0 Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.003573 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.003606 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"462db90c-a38b-45ca-9b68-6b7178e52fbe","Type":"ContainerDied","Data":"87991de259fc4828f435aa8e9ebb80d28580911161bc7b5aad2aad624e123ddc"} Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.003656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"462db90c-a38b-45ca-9b68-6b7178e52fbe","Type":"ContainerDied","Data":"cd685b32e7b3ec21cb07e69e2784716ecfc2ba1fcbf0a0fa36ac33b508972820"} Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.003687 4749 scope.go:117] "RemoveContainer" containerID="87991de259fc4828f435aa8e9ebb80d28580911161bc7b5aad2aad624e123ddc" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.008628 4749 generic.go:334] "Generic (PLEG): container finished" podID="7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" containerID="e82db328c64aa013b9b1e20bf35df3434346e7a1816af2bdde1236fd13cd3a0b" exitCode=0 Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.008716 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4","Type":"ContainerDied","Data":"e82db328c64aa013b9b1e20bf35df3434346e7a1816af2bdde1236fd13cd3a0b"} Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.008773 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4","Type":"ContainerDied","Data":"9e6fad4dc5ad41577653c1be7e9db3353d332e44f69d25c810226a819a7345e1"} Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.008871 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.030476 4749 scope.go:117] "RemoveContainer" containerID="bb68ff55f2f183f37f93f85f2733f28e4393f656c9e0a3a88a953e71d63e3d10" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.033434 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\") on node \"crc\" " Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.033588 4749 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.033658 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.033733 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.033793 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.033845 4749 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.033904 4749 
reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.033961 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.034014 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.034075 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6zxh\" (UniqueName: \"kubernetes.io/projected/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-kube-api-access-s6zxh\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.034130 4749 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.041354 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.051446 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.057654 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.066509 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.069283 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-server-0"] Mar 10 17:12:52 crc kubenswrapper[4749]: E0310 17:12:52.069657 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="462db90c-a38b-45ca-9b68-6b7178e52fbe" containerName="setup-container" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.069680 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="462db90c-a38b-45ca-9b68-6b7178e52fbe" containerName="setup-container" Mar 10 17:12:52 crc kubenswrapper[4749]: E0310 17:12:52.069699 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fda915ab-2411-484d-9fa8-7b80374f90cf" containerName="dnsmasq-dns" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.069711 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda915ab-2411-484d-9fa8-7b80374f90cf" containerName="dnsmasq-dns" Mar 10 17:12:52 crc kubenswrapper[4749]: E0310 17:12:52.069737 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fda915ab-2411-484d-9fa8-7b80374f90cf" containerName="init" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.069745 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda915ab-2411-484d-9fa8-7b80374f90cf" containerName="init" Mar 10 17:12:52 crc kubenswrapper[4749]: E0310 17:12:52.069762 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" containerName="rabbitmq" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.069770 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" containerName="rabbitmq" Mar 10 17:12:52 crc kubenswrapper[4749]: E0310 17:12:52.069786 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" containerName="setup-container" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.069794 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" containerName="setup-container" Mar 10 17:12:52 crc 
kubenswrapper[4749]: E0310 17:12:52.069805 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="462db90c-a38b-45ca-9b68-6b7178e52fbe" containerName="rabbitmq" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.069815 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="462db90c-a38b-45ca-9b68-6b7178e52fbe" containerName="rabbitmq" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.069948 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fda915ab-2411-484d-9fa8-7b80374f90cf" containerName="dnsmasq-dns" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.069960 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" containerName="rabbitmq" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.069972 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="462db90c-a38b-45ca-9b68-6b7178e52fbe" containerName="rabbitmq" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.070696 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.075600 4749 scope.go:117] "RemoveContainer" containerID="87991de259fc4828f435aa8e9ebb80d28580911161bc7b5aad2aad624e123ddc" Mar 10 17:12:52 crc kubenswrapper[4749]: E0310 17:12:52.077287 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87991de259fc4828f435aa8e9ebb80d28580911161bc7b5aad2aad624e123ddc\": container with ID starting with 87991de259fc4828f435aa8e9ebb80d28580911161bc7b5aad2aad624e123ddc not found: ID does not exist" containerID="87991de259fc4828f435aa8e9ebb80d28580911161bc7b5aad2aad624e123ddc" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.077453 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87991de259fc4828f435aa8e9ebb80d28580911161bc7b5aad2aad624e123ddc"} err="failed to get container status \"87991de259fc4828f435aa8e9ebb80d28580911161bc7b5aad2aad624e123ddc\": rpc error: code = NotFound desc = could not find container \"87991de259fc4828f435aa8e9ebb80d28580911161bc7b5aad2aad624e123ddc\": container with ID starting with 87991de259fc4828f435aa8e9ebb80d28580911161bc7b5aad2aad624e123ddc not found: ID does not exist" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.077568 4749 scope.go:117] "RemoveContainer" containerID="bb68ff55f2f183f37f93f85f2733f28e4393f656c9e0a3a88a953e71d63e3d10" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.077462 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.077614 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.077662 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 17:12:52 crc 
kubenswrapper[4749]: I0310 17:12:52.077691 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.077877 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.078346 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.078967 4749 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.079151 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6") on node "crc" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.080572 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-nrlmg" Mar 10 17:12:52 crc kubenswrapper[4749]: E0310 17:12:52.080698 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb68ff55f2f183f37f93f85f2733f28e4393f656c9e0a3a88a953e71d63e3d10\": container with ID starting with bb68ff55f2f183f37f93f85f2733f28e4393f656c9e0a3a88a953e71d63e3d10 not found: ID does not exist" containerID="bb68ff55f2f183f37f93f85f2733f28e4393f656c9e0a3a88a953e71d63e3d10" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.080731 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb68ff55f2f183f37f93f85f2733f28e4393f656c9e0a3a88a953e71d63e3d10"} err="failed to get container status \"bb68ff55f2f183f37f93f85f2733f28e4393f656c9e0a3a88a953e71d63e3d10\": rpc error: code = NotFound desc = could not find 
container \"bb68ff55f2f183f37f93f85f2733f28e4393f656c9e0a3a88a953e71d63e3d10\": container with ID starting with bb68ff55f2f183f37f93f85f2733f28e4393f656c9e0a3a88a953e71d63e3d10 not found: ID does not exist" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.080758 4749 scope.go:117] "RemoveContainer" containerID="e82db328c64aa013b9b1e20bf35df3434346e7a1816af2bdde1236fd13cd3a0b" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.086148 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.098106 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.100741 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.100916 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.101105 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.101262 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.101390 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-vrs88" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.101499 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.107710 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 
17:12:52.117270 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.128891 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.129236 4749 scope.go:117] "RemoveContainer" containerID="f42e112e31917c1371ca11a9cfd21d2142663fd32ad7edfc5037e72d11dfef7e" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.135179 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\") on node \"crc\" DevicePath \"\"" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.163967 4749 scope.go:117] "RemoveContainer" containerID="e82db328c64aa013b9b1e20bf35df3434346e7a1816af2bdde1236fd13cd3a0b" Mar 10 17:12:52 crc kubenswrapper[4749]: E0310 17:12:52.164406 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e82db328c64aa013b9b1e20bf35df3434346e7a1816af2bdde1236fd13cd3a0b\": container with ID starting with e82db328c64aa013b9b1e20bf35df3434346e7a1816af2bdde1236fd13cd3a0b not found: ID does not exist" containerID="e82db328c64aa013b9b1e20bf35df3434346e7a1816af2bdde1236fd13cd3a0b" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.164442 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82db328c64aa013b9b1e20bf35df3434346e7a1816af2bdde1236fd13cd3a0b"} err="failed to get container status \"e82db328c64aa013b9b1e20bf35df3434346e7a1816af2bdde1236fd13cd3a0b\": rpc error: code = NotFound desc = could not find container \"e82db328c64aa013b9b1e20bf35df3434346e7a1816af2bdde1236fd13cd3a0b\": container with ID starting with e82db328c64aa013b9b1e20bf35df3434346e7a1816af2bdde1236fd13cd3a0b not found: ID does not exist" Mar 10 
17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.164461 4749 scope.go:117] "RemoveContainer" containerID="f42e112e31917c1371ca11a9cfd21d2142663fd32ad7edfc5037e72d11dfef7e" Mar 10 17:12:52 crc kubenswrapper[4749]: E0310 17:12:52.164843 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f42e112e31917c1371ca11a9cfd21d2142663fd32ad7edfc5037e72d11dfef7e\": container with ID starting with f42e112e31917c1371ca11a9cfd21d2142663fd32ad7edfc5037e72d11dfef7e not found: ID does not exist" containerID="f42e112e31917c1371ca11a9cfd21d2142663fd32ad7edfc5037e72d11dfef7e" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.164894 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42e112e31917c1371ca11a9cfd21d2142663fd32ad7edfc5037e72d11dfef7e"} err="failed to get container status \"f42e112e31917c1371ca11a9cfd21d2142663fd32ad7edfc5037e72d11dfef7e\": rpc error: code = NotFound desc = could not find container \"f42e112e31917c1371ca11a9cfd21d2142663fd32ad7edfc5037e72d11dfef7e\": container with ID starting with f42e112e31917c1371ca11a9cfd21d2142663fd32ad7edfc5037e72d11dfef7e not found: ID does not exist" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.236455 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.236502 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " 
pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.236533 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-config-data\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.236564 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.236682 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a89a76a3-3810-431e-8061-a35fec1eff52-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.236712 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a89a76a3-3810-431e-8061-a35fec1eff52-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.236729 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 
17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.236746 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbd92\" (UniqueName: \"kubernetes.io/projected/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-kube-api-access-jbd92\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.236767 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a89a76a3-3810-431e-8061-a35fec1eff52-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.236838 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a89a76a3-3810-431e-8061-a35fec1eff52-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.237000 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a89a76a3-3810-431e-8061-a35fec1eff52-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.237057 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 
17:12:52.237109 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.237164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.237236 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a89a76a3-3810-431e-8061-a35fec1eff52-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.237267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c534202a-35a1-4320-92ea-64818eedbbd0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c534202a-35a1-4320-92ea-64818eedbbd0\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.237295 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a89a76a3-3810-431e-8061-a35fec1eff52-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" 
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.237417 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.237444 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a89a76a3-3810-431e-8061-a35fec1eff52-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.237519 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.237546 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a89a76a3-3810-431e-8061-a35fec1eff52-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.237591 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmrjk\" (UniqueName: \"kubernetes.io/projected/a89a76a3-3810-431e-8061-a35fec1eff52-kube-api-access-xmrjk\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" 
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.338895 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.338993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a89a76a3-3810-431e-8061-a35fec1eff52-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.339033 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c534202a-35a1-4320-92ea-64818eedbbd0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c534202a-35a1-4320-92ea-64818eedbbd0\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.339069 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a89a76a3-3810-431e-8061-a35fec1eff52-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.339109 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.339142 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a89a76a3-3810-431e-8061-a35fec1eff52-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.339192 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.339407 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a89a76a3-3810-431e-8061-a35fec1eff52-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.339934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.340105 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmrjk\" (UniqueName: \"kubernetes.io/projected/a89a76a3-3810-431e-8061-a35fec1eff52-kube-api-access-xmrjk\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.340131 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/a89a76a3-3810-431e-8061-a35fec1eff52-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.340167 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.340299 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.340120 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.340452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-config-data\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.340541 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.340636 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a89a76a3-3810-431e-8061-a35fec1eff52-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.340721 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a89a76a3-3810-431e-8061-a35fec1eff52-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.340804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.340875 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbd92\" (UniqueName: \"kubernetes.io/projected/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-kube-api-access-jbd92\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.340969 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a89a76a3-3810-431e-8061-a35fec1eff52-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc 
kubenswrapper[4749]: I0310 17:12:52.341010 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a89a76a3-3810-431e-8061-a35fec1eff52-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.341088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a89a76a3-3810-431e-8061-a35fec1eff52-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.341164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.341253 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.341621 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a89a76a3-3810-431e-8061-a35fec1eff52-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.341908 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a89a76a3-3810-431e-8061-a35fec1eff52-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.341982 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-config-data\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.342966 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a89a76a3-3810-431e-8061-a35fec1eff52-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.343139 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a89a76a3-3810-431e-8061-a35fec1eff52-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.343850 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.346391 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a89a76a3-3810-431e-8061-a35fec1eff52-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.346982 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.347127 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1a757f527bbff44cb7c7a8e9bf61d5f3057fd9c8b31a62a220fe4eb82fbadd55/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.347923 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a89a76a3-3810-431e-8061-a35fec1eff52-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.347963 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.349009 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c534202a-35a1-4320-92ea-64818eedbbd0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c534202a-35a1-4320-92ea-64818eedbbd0\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/de8c21fdda9e792f16e762fc10558fb2d006b79f3397dc6da868dfd14049f346/globalmount\"" pod="openstack/rabbitmq-server-0"
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.353012 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0"
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.357698 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0"
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.363145 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a89a76a3-3810-431e-8061-a35fec1eff52-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.363799 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0"
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.364018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0"
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.364668 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a89a76a3-3810-431e-8061-a35fec1eff52-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.365366 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0"
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.368823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbd92\" (UniqueName: \"kubernetes.io/projected/7ffde1ba-19eb-4d74-84b7-f64e82c3770f-kube-api-access-jbd92\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0"
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.370878 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmrjk\" (UniqueName: \"kubernetes.io/projected/a89a76a3-3810-431e-8061-a35fec1eff52-kube-api-access-xmrjk\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.386516 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c1dfddc-52b6-4bae-bf09-390eb70752f6\") pod \"rabbitmq-cell1-server-0\" (UID: \"a89a76a3-3810-431e-8061-a35fec1eff52\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.398860 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c534202a-35a1-4320-92ea-64818eedbbd0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c534202a-35a1-4320-92ea-64818eedbbd0\") pod \"rabbitmq-server-0\" (UID: \"7ffde1ba-19eb-4d74-84b7-f64e82c3770f\") " pod="openstack/rabbitmq-server-0"
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.440316 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.698687 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 10 17:12:52 crc kubenswrapper[4749]: I0310 17:12:52.949726 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 10 17:12:52 crc kubenswrapper[4749]: W0310 17:12:52.950977 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda89a76a3_3810_431e_8061_a35fec1eff52.slice/crio-e937c7bff596e4595a57c93cb806342f5baa87b7fd8a0413b7d06466ad6a06ec WatchSource:0}: Error finding container e937c7bff596e4595a57c93cb806342f5baa87b7fd8a0413b7d06466ad6a06ec: Status 404 returned error can't find the container with id e937c7bff596e4595a57c93cb806342f5baa87b7fd8a0413b7d06466ad6a06ec
Mar 10 17:12:53 crc kubenswrapper[4749]: I0310 17:12:53.023000 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a89a76a3-3810-431e-8061-a35fec1eff52","Type":"ContainerStarted","Data":"e937c7bff596e4595a57c93cb806342f5baa87b7fd8a0413b7d06466ad6a06ec"}
Mar 10 17:12:53 crc kubenswrapper[4749]: I0310 17:12:53.127293 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 10 17:12:53 crc kubenswrapper[4749]: W0310 17:12:53.128783 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ffde1ba_19eb_4d74_84b7_f64e82c3770f.slice/crio-38219c763b7e644124f8e52b3f5f87d3e52ccba0d640d43e2d87e2ffacc1e096 WatchSource:0}: Error finding container 38219c763b7e644124f8e52b3f5f87d3e52ccba0d640d43e2d87e2ffacc1e096: Status 404 returned error can't find the container with id 38219c763b7e644124f8e52b3f5f87d3e52ccba0d640d43e2d87e2ffacc1e096
Mar 10 17:12:53 crc kubenswrapper[4749]: I0310 17:12:53.624333 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="462db90c-a38b-45ca-9b68-6b7178e52fbe" path="/var/lib/kubelet/pods/462db90c-a38b-45ca-9b68-6b7178e52fbe/volumes"
Mar 10 17:12:53 crc kubenswrapper[4749]: I0310 17:12:53.625786 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4" path="/var/lib/kubelet/pods/7e98f3ae-2fbb-4d5d-aac7-df411f2df1a4/volumes"
Mar 10 17:12:54 crc kubenswrapper[4749]: I0310 17:12:54.030424 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7ffde1ba-19eb-4d74-84b7-f64e82c3770f","Type":"ContainerStarted","Data":"38219c763b7e644124f8e52b3f5f87d3e52ccba0d640d43e2d87e2ffacc1e096"}
Mar 10 17:12:55 crc kubenswrapper[4749]: I0310 17:12:55.043061 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a89a76a3-3810-431e-8061-a35fec1eff52","Type":"ContainerStarted","Data":"c5feaa8a26e164e3a3465648e75323ce73bfade11bbaea082d0519294b5545bf"}
Mar 10 17:12:55 crc kubenswrapper[4749]: I0310 17:12:55.045662 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7ffde1ba-19eb-4d74-84b7-f64e82c3770f","Type":"ContainerStarted","Data":"824d1c884d7e2888c9c4799dcaf3c117613e0c773f4558224e8f425a3f6e64ac"}
Mar 10 17:13:20 crc kubenswrapper[4749]: I0310 17:13:20.981200 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 17:13:20 crc kubenswrapper[4749]: I0310 17:13:20.981808 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 17:13:27 crc kubenswrapper[4749]: I0310 17:13:27.333161 4749 generic.go:334] "Generic (PLEG): container finished" podID="a89a76a3-3810-431e-8061-a35fec1eff52" containerID="c5feaa8a26e164e3a3465648e75323ce73bfade11bbaea082d0519294b5545bf" exitCode=0
Mar 10 17:13:27 crc kubenswrapper[4749]: I0310 17:13:27.333330 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a89a76a3-3810-431e-8061-a35fec1eff52","Type":"ContainerDied","Data":"c5feaa8a26e164e3a3465648e75323ce73bfade11bbaea082d0519294b5545bf"}
Mar 10 17:13:27 crc kubenswrapper[4749]: I0310 17:13:27.337158 4749 generic.go:334] "Generic (PLEG): container finished" podID="7ffde1ba-19eb-4d74-84b7-f64e82c3770f" containerID="824d1c884d7e2888c9c4799dcaf3c117613e0c773f4558224e8f425a3f6e64ac" exitCode=0
Mar 10 17:13:27 crc kubenswrapper[4749]: I0310 17:13:27.337208 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7ffde1ba-19eb-4d74-84b7-f64e82c3770f","Type":"ContainerDied","Data":"824d1c884d7e2888c9c4799dcaf3c117613e0c773f4558224e8f425a3f6e64ac"}
Mar 10 17:13:28 crc kubenswrapper[4749]: I0310 17:13:28.346683 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a89a76a3-3810-431e-8061-a35fec1eff52","Type":"ContainerStarted","Data":"e6df4d7622378eee4380c3de32e0ebab035e2dd157e04a193a3be410e95a03a2"}
Mar 10 17:13:28 crc kubenswrapper[4749]: I0310 17:13:28.347176 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 17:13:28 crc kubenswrapper[4749]: I0310 17:13:28.348701 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7ffde1ba-19eb-4d74-84b7-f64e82c3770f","Type":"ContainerStarted","Data":"ab1f2ea062d94ea333d60ed8cc0591ebd659614524238534b86ee3b453886a51"}
Mar 10 17:13:28 crc kubenswrapper[4749]: I0310 17:13:28.348914 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 10 17:13:28 crc kubenswrapper[4749]: I0310 17:13:28.396214 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.396197877 podStartE2EDuration="36.396197877s" podCreationTimestamp="2026-03-10 17:12:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:13:28.392181648 +0000 UTC m=+5105.514047335" watchObservedRunningTime="2026-03-10 17:13:28.396197877 +0000 UTC m=+5105.518063564"
Mar 10 17:13:28 crc kubenswrapper[4749]: I0310 17:13:28.397084 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.397077161 podStartE2EDuration="36.397077161s" podCreationTimestamp="2026-03-10 17:12:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:13:28.372136491 +0000 UTC m=+5105.494002178" watchObservedRunningTime="2026-03-10 17:13:28.397077161 +0000 UTC m=+5105.518942848"
Mar 10 17:13:42 crc kubenswrapper[4749]: I0310 17:13:42.445685 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 17:13:42 crc kubenswrapper[4749]: I0310 17:13:42.702734 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 10 17:13:45 crc kubenswrapper[4749]: I0310 17:13:45.595775 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Mar 10 17:13:45 crc kubenswrapper[4749]: I0310 17:13:45.597163 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 10 17:13:45 crc kubenswrapper[4749]: I0310 17:13:45.599866 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8w79f"
Mar 10 17:13:45 crc kubenswrapper[4749]: I0310 17:13:45.603929 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Mar 10 17:13:45 crc kubenswrapper[4749]: I0310 17:13:45.695292 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8ptt\" (UniqueName: \"kubernetes.io/projected/6963d1c0-ba04-445f-b4f7-bac3b2164698-kube-api-access-h8ptt\") pod \"mariadb-client\" (UID: \"6963d1c0-ba04-445f-b4f7-bac3b2164698\") " pod="openstack/mariadb-client"
Mar 10 17:13:45 crc kubenswrapper[4749]: I0310 17:13:45.797062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8ptt\" (UniqueName: \"kubernetes.io/projected/6963d1c0-ba04-445f-b4f7-bac3b2164698-kube-api-access-h8ptt\") pod \"mariadb-client\" (UID: \"6963d1c0-ba04-445f-b4f7-bac3b2164698\") " pod="openstack/mariadb-client"
Mar 10 17:13:45 crc kubenswrapper[4749]: I0310 17:13:45.823837 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8ptt\" (UniqueName: \"kubernetes.io/projected/6963d1c0-ba04-445f-b4f7-bac3b2164698-kube-api-access-h8ptt\") pod \"mariadb-client\" (UID: \"6963d1c0-ba04-445f-b4f7-bac3b2164698\") " pod="openstack/mariadb-client"
Mar 10 17:13:45 crc kubenswrapper[4749]: I0310 17:13:45.920203 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 10 17:13:46 crc kubenswrapper[4749]: I0310 17:13:46.447138 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Mar 10 17:13:46 crc kubenswrapper[4749]: I0310 17:13:46.499091 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"6963d1c0-ba04-445f-b4f7-bac3b2164698","Type":"ContainerStarted","Data":"c6ad7fb1752741ecc75fdc6f6bbaf7579933b9181b6d8f3c4c20b143aedab3ce"}
Mar 10 17:13:49 crc kubenswrapper[4749]: I0310 17:13:49.654737 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-skjw5"]
Mar 10 17:13:49 crc kubenswrapper[4749]: I0310 17:13:49.656904 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:13:49 crc kubenswrapper[4749]: I0310 17:13:49.667329 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-skjw5"]
Mar 10 17:13:49 crc kubenswrapper[4749]: I0310 17:13:49.775133 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b63c56-5326-445e-872f-8f419dc8f4b0-utilities\") pod \"certified-operators-skjw5\" (UID: \"c2b63c56-5326-445e-872f-8f419dc8f4b0\") " pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:13:49 crc kubenswrapper[4749]: I0310 17:13:49.775254 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b63c56-5326-445e-872f-8f419dc8f4b0-catalog-content\") pod \"certified-operators-skjw5\" (UID: \"c2b63c56-5326-445e-872f-8f419dc8f4b0\") " pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:13:49 crc kubenswrapper[4749]: I0310 17:13:49.775336 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvsnf\" (UniqueName: \"kubernetes.io/projected/c2b63c56-5326-445e-872f-8f419dc8f4b0-kube-api-access-cvsnf\") pod \"certified-operators-skjw5\" (UID: \"c2b63c56-5326-445e-872f-8f419dc8f4b0\") " pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:13:49 crc kubenswrapper[4749]: I0310 17:13:49.876212 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvsnf\" (UniqueName: \"kubernetes.io/projected/c2b63c56-5326-445e-872f-8f419dc8f4b0-kube-api-access-cvsnf\") pod \"certified-operators-skjw5\" (UID: \"c2b63c56-5326-445e-872f-8f419dc8f4b0\") " pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:13:49 crc kubenswrapper[4749]: I0310 17:13:49.876274 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b63c56-5326-445e-872f-8f419dc8f4b0-utilities\") pod \"certified-operators-skjw5\" (UID: \"c2b63c56-5326-445e-872f-8f419dc8f4b0\") " pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:13:49 crc kubenswrapper[4749]: I0310 17:13:49.876321 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b63c56-5326-445e-872f-8f419dc8f4b0-catalog-content\") pod \"certified-operators-skjw5\" (UID: \"c2b63c56-5326-445e-872f-8f419dc8f4b0\") " pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:13:49 crc kubenswrapper[4749]: I0310 17:13:49.876696 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b63c56-5326-445e-872f-8f419dc8f4b0-catalog-content\") pod \"certified-operators-skjw5\" (UID: \"c2b63c56-5326-445e-872f-8f419dc8f4b0\") " pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:13:49 crc kubenswrapper[4749]: I0310 17:13:49.876835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b63c56-5326-445e-872f-8f419dc8f4b0-utilities\") pod \"certified-operators-skjw5\" (UID: \"c2b63c56-5326-445e-872f-8f419dc8f4b0\") " pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:13:49 crc kubenswrapper[4749]: I0310 17:13:49.907512 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvsnf\" (UniqueName: \"kubernetes.io/projected/c2b63c56-5326-445e-872f-8f419dc8f4b0-kube-api-access-cvsnf\") pod \"certified-operators-skjw5\" (UID: \"c2b63c56-5326-445e-872f-8f419dc8f4b0\") " pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:13:50 crc kubenswrapper[4749]: I0310 17:13:50.012351 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:13:50 crc kubenswrapper[4749]: I0310 17:13:50.980561 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 17:13:50 crc kubenswrapper[4749]: I0310 17:13:50.980861 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 17:13:54 crc kubenswrapper[4749]: I0310 17:13:54.583016 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"6963d1c0-ba04-445f-b4f7-bac3b2164698","Type":"ContainerStarted","Data":"b7799d55614767bf822b7a648f605d3373d27afba2ff8b5efb0465e637b5f84b"}
Mar 10 17:13:54 crc kubenswrapper[4749]: I0310 17:13:54.607963 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-skjw5"]
Mar 10 17:13:54 crc kubenswrapper[4749]: I0310 17:13:54.609915 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=1.7574441840000001 podStartE2EDuration="9.609900306s" podCreationTimestamp="2026-03-10 17:13:45 +0000 UTC" firstStartedPulling="2026-03-10 17:13:46.442594114 +0000 UTC m=+5123.564459801" lastFinishedPulling="2026-03-10 17:13:54.295050236 +0000 UTC m=+5131.416915923" observedRunningTime="2026-03-10 17:13:54.597198131 +0000 UTC m=+5131.719063818" watchObservedRunningTime="2026-03-10 17:13:54.609900306 +0000 UTC m=+5131.731765993"
Mar 10 17:13:54 crc kubenswrapper[4749]: W0310 17:13:54.617791 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2b63c56_5326_445e_872f_8f419dc8f4b0.slice/crio-5dd7c95c083c28300d0f04e20cdbde1a92486309f7d971b4865f0d42a3d9a34d WatchSource:0}: Error finding container 5dd7c95c083c28300d0f04e20cdbde1a92486309f7d971b4865f0d42a3d9a34d: Status 404 returned error can't find the container with id 5dd7c95c083c28300d0f04e20cdbde1a92486309f7d971b4865f0d42a3d9a34d
Mar 10 17:13:55 crc kubenswrapper[4749]: I0310 17:13:55.591194 4749 generic.go:334] "Generic (PLEG): container finished" podID="c2b63c56-5326-445e-872f-8f419dc8f4b0" containerID="c3b6b5d949a94ff3e89e123e09c4f50385a1ffaf438fa00b911adb39dc4fb8bd" exitCode=0
Mar 10 17:13:55 crc kubenswrapper[4749]: I0310 17:13:55.591303 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skjw5" event={"ID":"c2b63c56-5326-445e-872f-8f419dc8f4b0","Type":"ContainerDied","Data":"c3b6b5d949a94ff3e89e123e09c4f50385a1ffaf438fa00b911adb39dc4fb8bd"}
Mar 10 17:13:55 crc kubenswrapper[4749]: I0310 17:13:55.591576 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skjw5" event={"ID":"c2b63c56-5326-445e-872f-8f419dc8f4b0","Type":"ContainerStarted","Data":"5dd7c95c083c28300d0f04e20cdbde1a92486309f7d971b4865f0d42a3d9a34d"}
Mar 10 17:13:56 crc kubenswrapper[4749]: E0310 17:13:56.958446 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2b63c56_5326_445e_872f_8f419dc8f4b0.slice/crio-conmon-11003e296349ad0ba39ead76c3aad4ec192b6d757ff652194204fe47db027705.scope\": RecentStats: unable to find data in memory cache]"
Mar 10 17:13:57 crc kubenswrapper[4749]: I0310 17:13:57.608100 4749 generic.go:334] "Generic (PLEG): container finished" podID="c2b63c56-5326-445e-872f-8f419dc8f4b0" containerID="11003e296349ad0ba39ead76c3aad4ec192b6d757ff652194204fe47db027705" exitCode=0
Mar 10 17:13:57 crc kubenswrapper[4749]: I0310 17:13:57.622871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skjw5" event={"ID":"c2b63c56-5326-445e-872f-8f419dc8f4b0","Type":"ContainerDied","Data":"11003e296349ad0ba39ead76c3aad4ec192b6d757ff652194204fe47db027705"}
Mar 10 17:13:59 crc kubenswrapper[4749]: I0310 17:13:59.622111 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skjw5" event={"ID":"c2b63c56-5326-445e-872f-8f419dc8f4b0","Type":"ContainerStarted","Data":"bd2b3a7bee621d59901f8edd70209bc8d562c126fb5decb9d04ecfed422dec4c"}
Mar 10 17:13:59 crc kubenswrapper[4749]: I0310 17:13:59.652861 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-skjw5" podStartSLOduration=7.74619964 podStartE2EDuration="10.652837128s" podCreationTimestamp="2026-03-10 17:13:49 +0000 UTC" firstStartedPulling="2026-03-10 17:13:55.592844896 +0000 UTC m=+5132.714710583" lastFinishedPulling="2026-03-10 17:13:58.499482384 +0000 UTC m=+5135.621348071" observedRunningTime="2026-03-10 17:13:59.646992198 +0000 UTC m=+5136.768857895" watchObservedRunningTime="2026-03-10 17:13:59.652837128 +0000 UTC m=+5136.774702815"
Mar 10 17:14:00 crc kubenswrapper[4749]: I0310 17:14:00.013055 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:14:00 crc kubenswrapper[4749]: I0310 17:14:00.013130 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:14:00 crc kubenswrapper[4749]: I0310 17:14:00.141831 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552714-ck9tm"]
Mar 10 17:14:00 crc kubenswrapper[4749]: I0310 17:14:00.143287 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552714-ck9tm"
Mar 10 17:14:00 crc kubenswrapper[4749]: I0310 17:14:00.150472 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7"
Mar 10 17:14:00 crc kubenswrapper[4749]: I0310 17:14:00.150734 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 17:14:00 crc kubenswrapper[4749]: I0310 17:14:00.152050 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 17:14:00 crc kubenswrapper[4749]: I0310 17:14:00.164565 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552714-ck9tm"]
Mar 10 17:14:00 crc kubenswrapper[4749]: I0310 17:14:00.243866 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d9fh\" (UniqueName: \"kubernetes.io/projected/dacc0ccc-aa71-4852-9518-e718f3491b86-kube-api-access-8d9fh\") pod \"auto-csr-approver-29552714-ck9tm\" (UID: \"dacc0ccc-aa71-4852-9518-e718f3491b86\") " pod="openshift-infra/auto-csr-approver-29552714-ck9tm"
Mar 10 17:14:00 crc kubenswrapper[4749]: I0310 17:14:00.345416 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d9fh\" (UniqueName: \"kubernetes.io/projected/dacc0ccc-aa71-4852-9518-e718f3491b86-kube-api-access-8d9fh\") pod \"auto-csr-approver-29552714-ck9tm\" (UID: \"dacc0ccc-aa71-4852-9518-e718f3491b86\") " pod="openshift-infra/auto-csr-approver-29552714-ck9tm"
Mar 10 17:14:00 crc kubenswrapper[4749]: I0310 17:14:00.367115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d9fh\" (UniqueName: \"kubernetes.io/projected/dacc0ccc-aa71-4852-9518-e718f3491b86-kube-api-access-8d9fh\") pod \"auto-csr-approver-29552714-ck9tm\" (UID: \"dacc0ccc-aa71-4852-9518-e718f3491b86\") " pod="openshift-infra/auto-csr-approver-29552714-ck9tm"
Mar 10 17:14:00 crc kubenswrapper[4749]: I0310 17:14:00.463869 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552714-ck9tm"
Mar 10 17:14:00 crc kubenswrapper[4749]: I0310 17:14:00.783084 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552714-ck9tm"]
Mar 10 17:14:01 crc kubenswrapper[4749]: I0310 17:14:01.062885 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-skjw5" podUID="c2b63c56-5326-445e-872f-8f419dc8f4b0" containerName="registry-server" probeResult="failure" output=<
Mar 10 17:14:01 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s
Mar 10 17:14:01 crc kubenswrapper[4749]: >
Mar 10 17:14:01 crc kubenswrapper[4749]: I0310 17:14:01.643337 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552714-ck9tm" event={"ID":"dacc0ccc-aa71-4852-9518-e718f3491b86","Type":"ContainerStarted","Data":"f69c9d4516e916f56b769a82da3b65a1f0a77a232c83e01898960cf7b8d644d1"}
Mar 10 17:14:02 crc kubenswrapper[4749]: I0310 17:14:02.657218 4749 generic.go:334] "Generic (PLEG): container finished" podID="dacc0ccc-aa71-4852-9518-e718f3491b86" containerID="512fd9b6b90f9558e3b5ceb36d336b3da56f827d0695b9132c302430e3574925" exitCode=0
Mar 10 17:14:02 crc kubenswrapper[4749]: I0310 17:14:02.657282 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552714-ck9tm" event={"ID":"dacc0ccc-aa71-4852-9518-e718f3491b86","Type":"ContainerDied","Data":"512fd9b6b90f9558e3b5ceb36d336b3da56f827d0695b9132c302430e3574925"}
Mar 10 17:14:03 crc kubenswrapper[4749]: I0310 17:14:03.996819 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552714-ck9tm"
Mar 10 17:14:04 crc kubenswrapper[4749]: I0310 17:14:04.106420 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d9fh\" (UniqueName: \"kubernetes.io/projected/dacc0ccc-aa71-4852-9518-e718f3491b86-kube-api-access-8d9fh\") pod \"dacc0ccc-aa71-4852-9518-e718f3491b86\" (UID: \"dacc0ccc-aa71-4852-9518-e718f3491b86\") "
Mar 10 17:14:04 crc kubenswrapper[4749]: I0310 17:14:04.111996 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dacc0ccc-aa71-4852-9518-e718f3491b86-kube-api-access-8d9fh" (OuterVolumeSpecName: "kube-api-access-8d9fh") pod "dacc0ccc-aa71-4852-9518-e718f3491b86" (UID: "dacc0ccc-aa71-4852-9518-e718f3491b86"). InnerVolumeSpecName "kube-api-access-8d9fh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 17:14:04 crc kubenswrapper[4749]: I0310 17:14:04.208818 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d9fh\" (UniqueName: \"kubernetes.io/projected/dacc0ccc-aa71-4852-9518-e718f3491b86-kube-api-access-8d9fh\") on node \"crc\" DevicePath \"\""
Mar 10 17:14:04 crc kubenswrapper[4749]: I0310 17:14:04.677900 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552714-ck9tm" event={"ID":"dacc0ccc-aa71-4852-9518-e718f3491b86","Type":"ContainerDied","Data":"f69c9d4516e916f56b769a82da3b65a1f0a77a232c83e01898960cf7b8d644d1"}
Mar 10 17:14:04 crc kubenswrapper[4749]: I0310 17:14:04.677940 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f69c9d4516e916f56b769a82da3b65a1f0a77a232c83e01898960cf7b8d644d1"
Mar 10 17:14:04 crc kubenswrapper[4749]: I0310 17:14:04.678028 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552714-ck9tm"
Mar 10 17:14:05 crc kubenswrapper[4749]: I0310 17:14:05.061391 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552708-lt2mw"]
Mar 10 17:14:05 crc kubenswrapper[4749]: I0310 17:14:05.066806 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552708-lt2mw"]
Mar 10 17:14:05 crc kubenswrapper[4749]: I0310 17:14:05.620132 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1093e396-a552-4168-a523-77c17d2f5f81" path="/var/lib/kubelet/pods/1093e396-a552-4168-a523-77c17d2f5f81/volumes"
Mar 10 17:14:06 crc kubenswrapper[4749]: I0310 17:14:06.877079 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Mar 10 17:14:06 crc kubenswrapper[4749]: I0310 17:14:06.877269 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="6963d1c0-ba04-445f-b4f7-bac3b2164698" containerName="mariadb-client" containerID="cri-o://b7799d55614767bf822b7a648f605d3373d27afba2ff8b5efb0465e637b5f84b" gracePeriod=30
Mar 10 17:14:07 crc kubenswrapper[4749]: I0310 17:14:07.402489 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 10 17:14:07 crc kubenswrapper[4749]: I0310 17:14:07.458336 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8ptt\" (UniqueName: \"kubernetes.io/projected/6963d1c0-ba04-445f-b4f7-bac3b2164698-kube-api-access-h8ptt\") pod \"6963d1c0-ba04-445f-b4f7-bac3b2164698\" (UID: \"6963d1c0-ba04-445f-b4f7-bac3b2164698\") "
Mar 10 17:14:07 crc kubenswrapper[4749]: I0310 17:14:07.464633 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6963d1c0-ba04-445f-b4f7-bac3b2164698-kube-api-access-h8ptt" (OuterVolumeSpecName: "kube-api-access-h8ptt") pod "6963d1c0-ba04-445f-b4f7-bac3b2164698" (UID: "6963d1c0-ba04-445f-b4f7-bac3b2164698"). InnerVolumeSpecName "kube-api-access-h8ptt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 17:14:07 crc kubenswrapper[4749]: I0310 17:14:07.560793 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8ptt\" (UniqueName: \"kubernetes.io/projected/6963d1c0-ba04-445f-b4f7-bac3b2164698-kube-api-access-h8ptt\") on node \"crc\" DevicePath \"\""
Mar 10 17:14:07 crc kubenswrapper[4749]: I0310 17:14:07.709133 4749 generic.go:334] "Generic (PLEG): container finished" podID="6963d1c0-ba04-445f-b4f7-bac3b2164698" containerID="b7799d55614767bf822b7a648f605d3373d27afba2ff8b5efb0465e637b5f84b" exitCode=143
Mar 10 17:14:07 crc kubenswrapper[4749]: I0310 17:14:07.709176 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"6963d1c0-ba04-445f-b4f7-bac3b2164698","Type":"ContainerDied","Data":"b7799d55614767bf822b7a648f605d3373d27afba2ff8b5efb0465e637b5f84b"}
Mar 10 17:14:07 crc kubenswrapper[4749]: I0310 17:14:07.709203 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"6963d1c0-ba04-445f-b4f7-bac3b2164698","Type":"ContainerDied","Data":"c6ad7fb1752741ecc75fdc6f6bbaf7579933b9181b6d8f3c4c20b143aedab3ce"}
Mar 10 17:14:07 crc kubenswrapper[4749]: I0310 17:14:07.709209 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 10 17:14:07 crc kubenswrapper[4749]: I0310 17:14:07.709218 4749 scope.go:117] "RemoveContainer" containerID="b7799d55614767bf822b7a648f605d3373d27afba2ff8b5efb0465e637b5f84b"
Mar 10 17:14:07 crc kubenswrapper[4749]: I0310 17:14:07.730018 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Mar 10 17:14:07 crc kubenswrapper[4749]: I0310 17:14:07.732435 4749 scope.go:117] "RemoveContainer" containerID="b7799d55614767bf822b7a648f605d3373d27afba2ff8b5efb0465e637b5f84b"
Mar 10 17:14:07 crc kubenswrapper[4749]: E0310 17:14:07.732995 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7799d55614767bf822b7a648f605d3373d27afba2ff8b5efb0465e637b5f84b\": container with ID starting with b7799d55614767bf822b7a648f605d3373d27afba2ff8b5efb0465e637b5f84b not found: ID does not exist" containerID="b7799d55614767bf822b7a648f605d3373d27afba2ff8b5efb0465e637b5f84b"
Mar 10 17:14:07 crc kubenswrapper[4749]: I0310 17:14:07.733023 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7799d55614767bf822b7a648f605d3373d27afba2ff8b5efb0465e637b5f84b"} err="failed to get container status \"b7799d55614767bf822b7a648f605d3373d27afba2ff8b5efb0465e637b5f84b\": rpc error: code = NotFound desc = could not find container \"b7799d55614767bf822b7a648f605d3373d27afba2ff8b5efb0465e637b5f84b\": container with ID starting with b7799d55614767bf822b7a648f605d3373d27afba2ff8b5efb0465e637b5f84b not found: ID does not exist"
Mar 10 17:14:07 crc kubenswrapper[4749]: I0310 17:14:07.735456 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Mar 10 17:14:09 crc kubenswrapper[4749]: I0310 17:14:09.622801 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6963d1c0-ba04-445f-b4f7-bac3b2164698" path="/var/lib/kubelet/pods/6963d1c0-ba04-445f-b4f7-bac3b2164698/volumes"
Mar 10 17:14:10 crc kubenswrapper[4749]: I0310 17:14:10.088819 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:14:10 crc kubenswrapper[4749]: I0310 17:14:10.147162 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:14:10 crc kubenswrapper[4749]: I0310 17:14:10.329772 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-skjw5"]
Mar 10 17:14:11 crc kubenswrapper[4749]: I0310 17:14:11.738485 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-skjw5" podUID="c2b63c56-5326-445e-872f-8f419dc8f4b0" containerName="registry-server" containerID="cri-o://bd2b3a7bee621d59901f8edd70209bc8d562c126fb5decb9d04ecfed422dec4c" gracePeriod=2
Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.197865 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-skjw5"
Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.240837 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b63c56-5326-445e-872f-8f419dc8f4b0-catalog-content\") pod \"c2b63c56-5326-445e-872f-8f419dc8f4b0\" (UID: \"c2b63c56-5326-445e-872f-8f419dc8f4b0\") "
Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.240898 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvsnf\" (UniqueName: \"kubernetes.io/projected/c2b63c56-5326-445e-872f-8f419dc8f4b0-kube-api-access-cvsnf\") pod \"c2b63c56-5326-445e-872f-8f419dc8f4b0\" (UID: \"c2b63c56-5326-445e-872f-8f419dc8f4b0\") "
Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.241002 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b63c56-5326-445e-872f-8f419dc8f4b0-utilities\") pod \"c2b63c56-5326-445e-872f-8f419dc8f4b0\" (UID: \"c2b63c56-5326-445e-872f-8f419dc8f4b0\") "
Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.242711 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2b63c56-5326-445e-872f-8f419dc8f4b0-utilities" (OuterVolumeSpecName: "utilities") pod "c2b63c56-5326-445e-872f-8f419dc8f4b0" (UID: "c2b63c56-5326-445e-872f-8f419dc8f4b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.250556 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b63c56-5326-445e-872f-8f419dc8f4b0-kube-api-access-cvsnf" (OuterVolumeSpecName: "kube-api-access-cvsnf") pod "c2b63c56-5326-445e-872f-8f419dc8f4b0" (UID: "c2b63c56-5326-445e-872f-8f419dc8f4b0"). InnerVolumeSpecName "kube-api-access-cvsnf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.303504 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2b63c56-5326-445e-872f-8f419dc8f4b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2b63c56-5326-445e-872f-8f419dc8f4b0" (UID: "c2b63c56-5326-445e-872f-8f419dc8f4b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.342162 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2b63c56-5326-445e-872f-8f419dc8f4b0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.342232 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvsnf\" (UniqueName: \"kubernetes.io/projected/c2b63c56-5326-445e-872f-8f419dc8f4b0-kube-api-access-cvsnf\") on node \"crc\" DevicePath \"\""
Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.342243 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2b63c56-5326-445e-872f-8f419dc8f4b0-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.748142 4749 generic.go:334] "Generic (PLEG): container finished" podID="c2b63c56-5326-445e-872f-8f419dc8f4b0" containerID="bd2b3a7bee621d59901f8edd70209bc8d562c126fb5decb9d04ecfed422dec4c" exitCode=0
Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.748224 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-skjw5" Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.748246 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skjw5" event={"ID":"c2b63c56-5326-445e-872f-8f419dc8f4b0","Type":"ContainerDied","Data":"bd2b3a7bee621d59901f8edd70209bc8d562c126fb5decb9d04ecfed422dec4c"} Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.749674 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skjw5" event={"ID":"c2b63c56-5326-445e-872f-8f419dc8f4b0","Type":"ContainerDied","Data":"5dd7c95c083c28300d0f04e20cdbde1a92486309f7d971b4865f0d42a3d9a34d"} Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.749696 4749 scope.go:117] "RemoveContainer" containerID="bd2b3a7bee621d59901f8edd70209bc8d562c126fb5decb9d04ecfed422dec4c" Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.774350 4749 scope.go:117] "RemoveContainer" containerID="11003e296349ad0ba39ead76c3aad4ec192b6d757ff652194204fe47db027705" Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.797636 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-skjw5"] Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.801239 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-skjw5"] Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.813168 4749 scope.go:117] "RemoveContainer" containerID="c3b6b5d949a94ff3e89e123e09c4f50385a1ffaf438fa00b911adb39dc4fb8bd" Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.831754 4749 scope.go:117] "RemoveContainer" containerID="bd2b3a7bee621d59901f8edd70209bc8d562c126fb5decb9d04ecfed422dec4c" Mar 10 17:14:12 crc kubenswrapper[4749]: E0310 17:14:12.832105 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bd2b3a7bee621d59901f8edd70209bc8d562c126fb5decb9d04ecfed422dec4c\": container with ID starting with bd2b3a7bee621d59901f8edd70209bc8d562c126fb5decb9d04ecfed422dec4c not found: ID does not exist" containerID="bd2b3a7bee621d59901f8edd70209bc8d562c126fb5decb9d04ecfed422dec4c" Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.832158 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd2b3a7bee621d59901f8edd70209bc8d562c126fb5decb9d04ecfed422dec4c"} err="failed to get container status \"bd2b3a7bee621d59901f8edd70209bc8d562c126fb5decb9d04ecfed422dec4c\": rpc error: code = NotFound desc = could not find container \"bd2b3a7bee621d59901f8edd70209bc8d562c126fb5decb9d04ecfed422dec4c\": container with ID starting with bd2b3a7bee621d59901f8edd70209bc8d562c126fb5decb9d04ecfed422dec4c not found: ID does not exist" Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.832189 4749 scope.go:117] "RemoveContainer" containerID="11003e296349ad0ba39ead76c3aad4ec192b6d757ff652194204fe47db027705" Mar 10 17:14:12 crc kubenswrapper[4749]: E0310 17:14:12.832577 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11003e296349ad0ba39ead76c3aad4ec192b6d757ff652194204fe47db027705\": container with ID starting with 11003e296349ad0ba39ead76c3aad4ec192b6d757ff652194204fe47db027705 not found: ID does not exist" containerID="11003e296349ad0ba39ead76c3aad4ec192b6d757ff652194204fe47db027705" Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.832619 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11003e296349ad0ba39ead76c3aad4ec192b6d757ff652194204fe47db027705"} err="failed to get container status \"11003e296349ad0ba39ead76c3aad4ec192b6d757ff652194204fe47db027705\": rpc error: code = NotFound desc = could not find container \"11003e296349ad0ba39ead76c3aad4ec192b6d757ff652194204fe47db027705\": container with ID 
starting with 11003e296349ad0ba39ead76c3aad4ec192b6d757ff652194204fe47db027705 not found: ID does not exist" Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.832645 4749 scope.go:117] "RemoveContainer" containerID="c3b6b5d949a94ff3e89e123e09c4f50385a1ffaf438fa00b911adb39dc4fb8bd" Mar 10 17:14:12 crc kubenswrapper[4749]: E0310 17:14:12.833368 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3b6b5d949a94ff3e89e123e09c4f50385a1ffaf438fa00b911adb39dc4fb8bd\": container with ID starting with c3b6b5d949a94ff3e89e123e09c4f50385a1ffaf438fa00b911adb39dc4fb8bd not found: ID does not exist" containerID="c3b6b5d949a94ff3e89e123e09c4f50385a1ffaf438fa00b911adb39dc4fb8bd" Mar 10 17:14:12 crc kubenswrapper[4749]: I0310 17:14:12.833663 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3b6b5d949a94ff3e89e123e09c4f50385a1ffaf438fa00b911adb39dc4fb8bd"} err="failed to get container status \"c3b6b5d949a94ff3e89e123e09c4f50385a1ffaf438fa00b911adb39dc4fb8bd\": rpc error: code = NotFound desc = could not find container \"c3b6b5d949a94ff3e89e123e09c4f50385a1ffaf438fa00b911adb39dc4fb8bd\": container with ID starting with c3b6b5d949a94ff3e89e123e09c4f50385a1ffaf438fa00b911adb39dc4fb8bd not found: ID does not exist" Mar 10 17:14:13 crc kubenswrapper[4749]: I0310 17:14:13.617694 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b63c56-5326-445e-872f-8f419dc8f4b0" path="/var/lib/kubelet/pods/c2b63c56-5326-445e-872f-8f419dc8f4b0/volumes" Mar 10 17:14:20 crc kubenswrapper[4749]: I0310 17:14:20.980659 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:14:20 crc kubenswrapper[4749]: I0310 
17:14:20.981065 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:14:20 crc kubenswrapper[4749]: I0310 17:14:20.981101 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 17:14:20 crc kubenswrapper[4749]: I0310 17:14:20.981593 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3dc6824f04ac0b657b53434f59fc2f1cf018f8ac81376732d74b1a55125ca1c7"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 17:14:20 crc kubenswrapper[4749]: I0310 17:14:20.981647 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://3dc6824f04ac0b657b53434f59fc2f1cf018f8ac81376732d74b1a55125ca1c7" gracePeriod=600 Mar 10 17:14:21 crc kubenswrapper[4749]: I0310 17:14:21.828847 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="3dc6824f04ac0b657b53434f59fc2f1cf018f8ac81376732d74b1a55125ca1c7" exitCode=0 Mar 10 17:14:21 crc kubenswrapper[4749]: I0310 17:14:21.828909 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"3dc6824f04ac0b657b53434f59fc2f1cf018f8ac81376732d74b1a55125ca1c7"} Mar 10 17:14:21 crc 
kubenswrapper[4749]: I0310 17:14:21.829189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8"} Mar 10 17:14:21 crc kubenswrapper[4749]: I0310 17:14:21.829214 4749 scope.go:117] "RemoveContainer" containerID="14f623813ec3211f75e5f85e8e21a1be2ac1d2efbeb2e9276a099688f851bd6c" Mar 10 17:14:27 crc kubenswrapper[4749]: I0310 17:14:27.115559 4749 scope.go:117] "RemoveContainer" containerID="8654cb0abfa992576dc7f097aa0d5c9d19b4c667ea83e12f917c66b7513ef295" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.162527 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5"] Mar 10 17:15:00 crc kubenswrapper[4749]: E0310 17:15:00.163536 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dacc0ccc-aa71-4852-9518-e718f3491b86" containerName="oc" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.163554 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dacc0ccc-aa71-4852-9518-e718f3491b86" containerName="oc" Mar 10 17:15:00 crc kubenswrapper[4749]: E0310 17:15:00.163573 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6963d1c0-ba04-445f-b4f7-bac3b2164698" containerName="mariadb-client" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.163594 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6963d1c0-ba04-445f-b4f7-bac3b2164698" containerName="mariadb-client" Mar 10 17:15:00 crc kubenswrapper[4749]: E0310 17:15:00.163613 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b63c56-5326-445e-872f-8f419dc8f4b0" containerName="registry-server" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.163619 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b63c56-5326-445e-872f-8f419dc8f4b0" 
containerName="registry-server" Mar 10 17:15:00 crc kubenswrapper[4749]: E0310 17:15:00.163640 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b63c56-5326-445e-872f-8f419dc8f4b0" containerName="extract-utilities" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.163646 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b63c56-5326-445e-872f-8f419dc8f4b0" containerName="extract-utilities" Mar 10 17:15:00 crc kubenswrapper[4749]: E0310 17:15:00.163658 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b63c56-5326-445e-872f-8f419dc8f4b0" containerName="extract-content" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.163668 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b63c56-5326-445e-872f-8f419dc8f4b0" containerName="extract-content" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.164039 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6963d1c0-ba04-445f-b4f7-bac3b2164698" containerName="mariadb-client" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.164105 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2b63c56-5326-445e-872f-8f419dc8f4b0" containerName="registry-server" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.164128 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dacc0ccc-aa71-4852-9518-e718f3491b86" containerName="oc" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.165043 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.166585 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.168838 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.175179 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5"] Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.317708 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63333664-d454-482c-b234-b53244a51e15-config-volume\") pod \"collect-profiles-29552715-bpkq5\" (UID: \"63333664-d454-482c-b234-b53244a51e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.318035 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5spk\" (UniqueName: \"kubernetes.io/projected/63333664-d454-482c-b234-b53244a51e15-kube-api-access-q5spk\") pod \"collect-profiles-29552715-bpkq5\" (UID: \"63333664-d454-482c-b234-b53244a51e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.318163 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63333664-d454-482c-b234-b53244a51e15-secret-volume\") pod \"collect-profiles-29552715-bpkq5\" (UID: \"63333664-d454-482c-b234-b53244a51e15\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.419782 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63333664-d454-482c-b234-b53244a51e15-config-volume\") pod \"collect-profiles-29552715-bpkq5\" (UID: \"63333664-d454-482c-b234-b53244a51e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.419831 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5spk\" (UniqueName: \"kubernetes.io/projected/63333664-d454-482c-b234-b53244a51e15-kube-api-access-q5spk\") pod \"collect-profiles-29552715-bpkq5\" (UID: \"63333664-d454-482c-b234-b53244a51e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.419887 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63333664-d454-482c-b234-b53244a51e15-secret-volume\") pod \"collect-profiles-29552715-bpkq5\" (UID: \"63333664-d454-482c-b234-b53244a51e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.422259 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63333664-d454-482c-b234-b53244a51e15-config-volume\") pod \"collect-profiles-29552715-bpkq5\" (UID: \"63333664-d454-482c-b234-b53244a51e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.431741 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/63333664-d454-482c-b234-b53244a51e15-secret-volume\") pod \"collect-profiles-29552715-bpkq5\" (UID: \"63333664-d454-482c-b234-b53244a51e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.441444 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5spk\" (UniqueName: \"kubernetes.io/projected/63333664-d454-482c-b234-b53244a51e15-kube-api-access-q5spk\") pod \"collect-profiles-29552715-bpkq5\" (UID: \"63333664-d454-482c-b234-b53244a51e15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.496891 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" Mar 10 17:15:00 crc kubenswrapper[4749]: I0310 17:15:00.919152 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5"] Mar 10 17:15:01 crc kubenswrapper[4749]: I0310 17:15:01.174508 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" event={"ID":"63333664-d454-482c-b234-b53244a51e15","Type":"ContainerStarted","Data":"a20264f23643583ed2adbe32d1babbfacc0f8385fa5867690efee2a36d9611f4"} Mar 10 17:15:01 crc kubenswrapper[4749]: I0310 17:15:01.174550 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" event={"ID":"63333664-d454-482c-b234-b53244a51e15","Type":"ContainerStarted","Data":"247d79751cbb9801803e2a20a6b424b8c03fe6418616bfce2902618fed966e73"} Mar 10 17:15:01 crc kubenswrapper[4749]: I0310 17:15:01.195639 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" 
podStartSLOduration=1.195622057 podStartE2EDuration="1.195622057s" podCreationTimestamp="2026-03-10 17:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:15:01.195100363 +0000 UTC m=+5198.316966050" watchObservedRunningTime="2026-03-10 17:15:01.195622057 +0000 UTC m=+5198.317487744" Mar 10 17:15:02 crc kubenswrapper[4749]: I0310 17:15:02.182352 4749 generic.go:334] "Generic (PLEG): container finished" podID="63333664-d454-482c-b234-b53244a51e15" containerID="a20264f23643583ed2adbe32d1babbfacc0f8385fa5867690efee2a36d9611f4" exitCode=0 Mar 10 17:15:02 crc kubenswrapper[4749]: I0310 17:15:02.182409 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" event={"ID":"63333664-d454-482c-b234-b53244a51e15","Type":"ContainerDied","Data":"a20264f23643583ed2adbe32d1babbfacc0f8385fa5867690efee2a36d9611f4"} Mar 10 17:15:03 crc kubenswrapper[4749]: I0310 17:15:03.539781 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" Mar 10 17:15:03 crc kubenswrapper[4749]: I0310 17:15:03.665810 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5spk\" (UniqueName: \"kubernetes.io/projected/63333664-d454-482c-b234-b53244a51e15-kube-api-access-q5spk\") pod \"63333664-d454-482c-b234-b53244a51e15\" (UID: \"63333664-d454-482c-b234-b53244a51e15\") " Mar 10 17:15:03 crc kubenswrapper[4749]: I0310 17:15:03.665940 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63333664-d454-482c-b234-b53244a51e15-secret-volume\") pod \"63333664-d454-482c-b234-b53244a51e15\" (UID: \"63333664-d454-482c-b234-b53244a51e15\") " Mar 10 17:15:03 crc kubenswrapper[4749]: I0310 17:15:03.665983 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63333664-d454-482c-b234-b53244a51e15-config-volume\") pod \"63333664-d454-482c-b234-b53244a51e15\" (UID: \"63333664-d454-482c-b234-b53244a51e15\") " Mar 10 17:15:03 crc kubenswrapper[4749]: I0310 17:15:03.666781 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63333664-d454-482c-b234-b53244a51e15-config-volume" (OuterVolumeSpecName: "config-volume") pod "63333664-d454-482c-b234-b53244a51e15" (UID: "63333664-d454-482c-b234-b53244a51e15"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:15:03 crc kubenswrapper[4749]: I0310 17:15:03.672079 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63333664-d454-482c-b234-b53244a51e15-kube-api-access-q5spk" (OuterVolumeSpecName: "kube-api-access-q5spk") pod "63333664-d454-482c-b234-b53244a51e15" (UID: "63333664-d454-482c-b234-b53244a51e15"). 
InnerVolumeSpecName "kube-api-access-q5spk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:15:03 crc kubenswrapper[4749]: I0310 17:15:03.673558 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63333664-d454-482c-b234-b53244a51e15-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "63333664-d454-482c-b234-b53244a51e15" (UID: "63333664-d454-482c-b234-b53244a51e15"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:15:03 crc kubenswrapper[4749]: I0310 17:15:03.767497 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5spk\" (UniqueName: \"kubernetes.io/projected/63333664-d454-482c-b234-b53244a51e15-kube-api-access-q5spk\") on node \"crc\" DevicePath \"\"" Mar 10 17:15:03 crc kubenswrapper[4749]: I0310 17:15:03.767536 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63333664-d454-482c-b234-b53244a51e15-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 17:15:03 crc kubenswrapper[4749]: I0310 17:15:03.767547 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63333664-d454-482c-b234-b53244a51e15-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 17:15:04 crc kubenswrapper[4749]: I0310 17:15:04.201357 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" event={"ID":"63333664-d454-482c-b234-b53244a51e15","Type":"ContainerDied","Data":"247d79751cbb9801803e2a20a6b424b8c03fe6418616bfce2902618fed966e73"} Mar 10 17:15:04 crc kubenswrapper[4749]: I0310 17:15:04.201432 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="247d79751cbb9801803e2a20a6b424b8c03fe6418616bfce2902618fed966e73" Mar 10 17:15:04 crc kubenswrapper[4749]: I0310 17:15:04.201468 4749 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552715-bpkq5" Mar 10 17:15:04 crc kubenswrapper[4749]: I0310 17:15:04.284891 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c"] Mar 10 17:15:04 crc kubenswrapper[4749]: I0310 17:15:04.293292 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552670-9qs7c"] Mar 10 17:15:05 crc kubenswrapper[4749]: I0310 17:15:05.617888 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62954212-e121-40d6-aa39-972fdb4f2873" path="/var/lib/kubelet/pods/62954212-e121-40d6-aa39-972fdb4f2873/volumes" Mar 10 17:15:27 crc kubenswrapper[4749]: I0310 17:15:27.227500 4749 scope.go:117] "RemoveContainer" containerID="80e8d4c67c173dbcbf27ff2986cc013adfe74daa53227fa451f438c349b95ddc" Mar 10 17:16:00 crc kubenswrapper[4749]: I0310 17:16:00.150006 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552716-rnffm"] Mar 10 17:16:00 crc kubenswrapper[4749]: E0310 17:16:00.150754 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63333664-d454-482c-b234-b53244a51e15" containerName="collect-profiles" Mar 10 17:16:00 crc kubenswrapper[4749]: I0310 17:16:00.150840 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="63333664-d454-482c-b234-b53244a51e15" containerName="collect-profiles" Mar 10 17:16:00 crc kubenswrapper[4749]: I0310 17:16:00.151037 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="63333664-d454-482c-b234-b53244a51e15" containerName="collect-profiles" Mar 10 17:16:00 crc kubenswrapper[4749]: I0310 17:16:00.151602 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552716-rnffm" Mar 10 17:16:00 crc kubenswrapper[4749]: I0310 17:16:00.153800 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:16:00 crc kubenswrapper[4749]: I0310 17:16:00.154278 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:16:00 crc kubenswrapper[4749]: I0310 17:16:00.154530 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:16:00 crc kubenswrapper[4749]: I0310 17:16:00.167589 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsts5\" (UniqueName: \"kubernetes.io/projected/d0ed8b50-c65b-4d45-88f8-a5ba25b277c7-kube-api-access-nsts5\") pod \"auto-csr-approver-29552716-rnffm\" (UID: \"d0ed8b50-c65b-4d45-88f8-a5ba25b277c7\") " pod="openshift-infra/auto-csr-approver-29552716-rnffm" Mar 10 17:16:00 crc kubenswrapper[4749]: I0310 17:16:00.169159 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552716-rnffm"] Mar 10 17:16:00 crc kubenswrapper[4749]: I0310 17:16:00.269150 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsts5\" (UniqueName: \"kubernetes.io/projected/d0ed8b50-c65b-4d45-88f8-a5ba25b277c7-kube-api-access-nsts5\") pod \"auto-csr-approver-29552716-rnffm\" (UID: \"d0ed8b50-c65b-4d45-88f8-a5ba25b277c7\") " pod="openshift-infra/auto-csr-approver-29552716-rnffm" Mar 10 17:16:00 crc kubenswrapper[4749]: I0310 17:16:00.291305 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsts5\" (UniqueName: \"kubernetes.io/projected/d0ed8b50-c65b-4d45-88f8-a5ba25b277c7-kube-api-access-nsts5\") pod \"auto-csr-approver-29552716-rnffm\" (UID: \"d0ed8b50-c65b-4d45-88f8-a5ba25b277c7\") " 
pod="openshift-infra/auto-csr-approver-29552716-rnffm"
Mar 10 17:16:00 crc kubenswrapper[4749]: I0310 17:16:00.477573 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552716-rnffm"
Mar 10 17:16:00 crc kubenswrapper[4749]: I0310 17:16:00.942746 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552716-rnffm"]
Mar 10 17:16:00 crc kubenswrapper[4749]: W0310 17:16:00.947606 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0ed8b50_c65b_4d45_88f8_a5ba25b277c7.slice/crio-b96c43123df687ba1f9d06892a83f29101439c6243bd686bd5f19f2d28ff596f WatchSource:0}: Error finding container b96c43123df687ba1f9d06892a83f29101439c6243bd686bd5f19f2d28ff596f: Status 404 returned error can't find the container with id b96c43123df687ba1f9d06892a83f29101439c6243bd686bd5f19f2d28ff596f
Mar 10 17:16:01 crc kubenswrapper[4749]: I0310 17:16:01.662783 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552716-rnffm" event={"ID":"d0ed8b50-c65b-4d45-88f8-a5ba25b277c7","Type":"ContainerStarted","Data":"b96c43123df687ba1f9d06892a83f29101439c6243bd686bd5f19f2d28ff596f"}
Mar 10 17:16:02 crc kubenswrapper[4749]: I0310 17:16:02.674472 4749 generic.go:334] "Generic (PLEG): container finished" podID="d0ed8b50-c65b-4d45-88f8-a5ba25b277c7" containerID="b357b1fbcc0cb613556e38f737fa7dab83e3e108a6f7630b0ef078d622e82ef6" exitCode=0
Mar 10 17:16:02 crc kubenswrapper[4749]: I0310 17:16:02.674604 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552716-rnffm" event={"ID":"d0ed8b50-c65b-4d45-88f8-a5ba25b277c7","Type":"ContainerDied","Data":"b357b1fbcc0cb613556e38f737fa7dab83e3e108a6f7630b0ef078d622e82ef6"}
Mar 10 17:16:03 crc kubenswrapper[4749]: I0310 17:16:03.985435 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552716-rnffm"
Mar 10 17:16:04 crc kubenswrapper[4749]: I0310 17:16:04.024090 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsts5\" (UniqueName: \"kubernetes.io/projected/d0ed8b50-c65b-4d45-88f8-a5ba25b277c7-kube-api-access-nsts5\") pod \"d0ed8b50-c65b-4d45-88f8-a5ba25b277c7\" (UID: \"d0ed8b50-c65b-4d45-88f8-a5ba25b277c7\") "
Mar 10 17:16:04 crc kubenswrapper[4749]: I0310 17:16:04.029571 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0ed8b50-c65b-4d45-88f8-a5ba25b277c7-kube-api-access-nsts5" (OuterVolumeSpecName: "kube-api-access-nsts5") pod "d0ed8b50-c65b-4d45-88f8-a5ba25b277c7" (UID: "d0ed8b50-c65b-4d45-88f8-a5ba25b277c7"). InnerVolumeSpecName "kube-api-access-nsts5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 17:16:04 crc kubenswrapper[4749]: I0310 17:16:04.126177 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsts5\" (UniqueName: \"kubernetes.io/projected/d0ed8b50-c65b-4d45-88f8-a5ba25b277c7-kube-api-access-nsts5\") on node \"crc\" DevicePath \"\""
Mar 10 17:16:04 crc kubenswrapper[4749]: I0310 17:16:04.691743 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552716-rnffm" event={"ID":"d0ed8b50-c65b-4d45-88f8-a5ba25b277c7","Type":"ContainerDied","Data":"b96c43123df687ba1f9d06892a83f29101439c6243bd686bd5f19f2d28ff596f"}
Mar 10 17:16:04 crc kubenswrapper[4749]: I0310 17:16:04.691788 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b96c43123df687ba1f9d06892a83f29101439c6243bd686bd5f19f2d28ff596f"
Mar 10 17:16:04 crc kubenswrapper[4749]: I0310 17:16:04.691859 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552716-rnffm"
Mar 10 17:16:05 crc kubenswrapper[4749]: I0310 17:16:05.059668 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552710-vnmvn"]
Mar 10 17:16:05 crc kubenswrapper[4749]: I0310 17:16:05.067213 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552710-vnmvn"]
Mar 10 17:16:05 crc kubenswrapper[4749]: I0310 17:16:05.616083 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f4f3dcc-791c-4146-abdc-24b11d5a85fa" path="/var/lib/kubelet/pods/2f4f3dcc-791c-4146-abdc-24b11d5a85fa/volumes"
Mar 10 17:16:27 crc kubenswrapper[4749]: I0310 17:16:27.275697 4749 scope.go:117] "RemoveContainer" containerID="ea0e01fd6cafbf133ff89f605773f476559a340189d659f4d017c62899b2278c"
Mar 10 17:16:27 crc kubenswrapper[4749]: I0310 17:16:27.295206 4749 scope.go:117] "RemoveContainer" containerID="eceb9ebe21bdc849273f8d8896390da1d4cfedeca81205eafe0717718e8d3ffe"
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.558705 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-th484"]
Mar 10 17:16:35 crc kubenswrapper[4749]: E0310 17:16:35.559682 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0ed8b50-c65b-4d45-88f8-a5ba25b277c7" containerName="oc"
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.559706 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ed8b50-c65b-4d45-88f8-a5ba25b277c7" containerName="oc"
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.559955 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0ed8b50-c65b-4d45-88f8-a5ba25b277c7" containerName="oc"
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.561329 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.573421 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-th484"]
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.727002 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6s2d\" (UniqueName: \"kubernetes.io/projected/3e7bae71-1d16-46c6-b731-33000e7a1f09-kube-api-access-z6s2d\") pod \"redhat-operators-th484\" (UID: \"3e7bae71-1d16-46c6-b731-33000e7a1f09\") " pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.727050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7bae71-1d16-46c6-b731-33000e7a1f09-utilities\") pod \"redhat-operators-th484\" (UID: \"3e7bae71-1d16-46c6-b731-33000e7a1f09\") " pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.727599 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7bae71-1d16-46c6-b731-33000e7a1f09-catalog-content\") pod \"redhat-operators-th484\" (UID: \"3e7bae71-1d16-46c6-b731-33000e7a1f09\") " pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.829665 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7bae71-1d16-46c6-b731-33000e7a1f09-catalog-content\") pod \"redhat-operators-th484\" (UID: \"3e7bae71-1d16-46c6-b731-33000e7a1f09\") " pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.829771 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6s2d\" (UniqueName: \"kubernetes.io/projected/3e7bae71-1d16-46c6-b731-33000e7a1f09-kube-api-access-z6s2d\") pod \"redhat-operators-th484\" (UID: \"3e7bae71-1d16-46c6-b731-33000e7a1f09\") " pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.829797 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7bae71-1d16-46c6-b731-33000e7a1f09-utilities\") pod \"redhat-operators-th484\" (UID: \"3e7bae71-1d16-46c6-b731-33000e7a1f09\") " pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.830244 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7bae71-1d16-46c6-b731-33000e7a1f09-catalog-content\") pod \"redhat-operators-th484\" (UID: \"3e7bae71-1d16-46c6-b731-33000e7a1f09\") " pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.830316 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7bae71-1d16-46c6-b731-33000e7a1f09-utilities\") pod \"redhat-operators-th484\" (UID: \"3e7bae71-1d16-46c6-b731-33000e7a1f09\") " pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.858366 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6s2d\" (UniqueName: \"kubernetes.io/projected/3e7bae71-1d16-46c6-b731-33000e7a1f09-kube-api-access-z6s2d\") pod \"redhat-operators-th484\" (UID: \"3e7bae71-1d16-46c6-b731-33000e7a1f09\") " pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:35 crc kubenswrapper[4749]: I0310 17:16:35.895303 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:36 crc kubenswrapper[4749]: I0310 17:16:36.377639 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-th484"]
Mar 10 17:16:36 crc kubenswrapper[4749]: I0310 17:16:36.967099 4749 generic.go:334] "Generic (PLEG): container finished" podID="3e7bae71-1d16-46c6-b731-33000e7a1f09" containerID="75770efd0d6c3c966473d0c898697403110b2d06b7ca0d3d848ee92484942b42" exitCode=0
Mar 10 17:16:36 crc kubenswrapper[4749]: I0310 17:16:36.967206 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-th484" event={"ID":"3e7bae71-1d16-46c6-b731-33000e7a1f09","Type":"ContainerDied","Data":"75770efd0d6c3c966473d0c898697403110b2d06b7ca0d3d848ee92484942b42"}
Mar 10 17:16:36 crc kubenswrapper[4749]: I0310 17:16:36.967502 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-th484" event={"ID":"3e7bae71-1d16-46c6-b731-33000e7a1f09","Type":"ContainerStarted","Data":"8154e391c104fbe81459ae4188aad2254bf06946616ae03f6b3d3d93a26e213e"}
Mar 10 17:16:37 crc kubenswrapper[4749]: I0310 17:16:37.978341 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-th484" event={"ID":"3e7bae71-1d16-46c6-b731-33000e7a1f09","Type":"ContainerStarted","Data":"99835415527b133a820d2a6a1e913f06806eba744d04e8a6a78663b81f763b6e"}
Mar 10 17:16:39 crc kubenswrapper[4749]: I0310 17:16:39.006998 4749 generic.go:334] "Generic (PLEG): container finished" podID="3e7bae71-1d16-46c6-b731-33000e7a1f09" containerID="99835415527b133a820d2a6a1e913f06806eba744d04e8a6a78663b81f763b6e" exitCode=0
Mar 10 17:16:39 crc kubenswrapper[4749]: I0310 17:16:39.007143 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-th484" event={"ID":"3e7bae71-1d16-46c6-b731-33000e7a1f09","Type":"ContainerDied","Data":"99835415527b133a820d2a6a1e913f06806eba744d04e8a6a78663b81f763b6e"}
Mar 10 17:16:40 crc kubenswrapper[4749]: I0310 17:16:40.019914 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-th484" event={"ID":"3e7bae71-1d16-46c6-b731-33000e7a1f09","Type":"ContainerStarted","Data":"57e2ea574d17f5f071a40103101154e3c3aa2c1f88b2c3c796148684879ddbf4"}
Mar 10 17:16:40 crc kubenswrapper[4749]: I0310 17:16:40.052084 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-th484" podStartSLOduration=2.5953800080000002 podStartE2EDuration="5.052064153s" podCreationTimestamp="2026-03-10 17:16:35 +0000 UTC" firstStartedPulling="2026-03-10 17:16:36.968537114 +0000 UTC m=+5294.090402801" lastFinishedPulling="2026-03-10 17:16:39.425221259 +0000 UTC m=+5296.547086946" observedRunningTime="2026-03-10 17:16:40.048029863 +0000 UTC m=+5297.169895570" watchObservedRunningTime="2026-03-10 17:16:40.052064153 +0000 UTC m=+5297.173929850"
Mar 10 17:16:45 crc kubenswrapper[4749]: I0310 17:16:45.896132 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:45 crc kubenswrapper[4749]: I0310 17:16:45.896864 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:46 crc kubenswrapper[4749]: I0310 17:16:46.946056 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-th484" podUID="3e7bae71-1d16-46c6-b731-33000e7a1f09" containerName="registry-server" probeResult="failure" output=<
Mar 10 17:16:46 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s
Mar 10 17:16:46 crc kubenswrapper[4749]: >
Mar 10 17:16:50 crc kubenswrapper[4749]: I0310 17:16:50.981478 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 17:16:50 crc kubenswrapper[4749]: I0310 17:16:50.982634 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 17:16:55 crc kubenswrapper[4749]: I0310 17:16:55.953251 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:56 crc kubenswrapper[4749]: I0310 17:16:56.020298 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:56 crc kubenswrapper[4749]: I0310 17:16:56.216259 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-th484"]
Mar 10 17:16:57 crc kubenswrapper[4749]: I0310 17:16:57.156037 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-th484" podUID="3e7bae71-1d16-46c6-b731-33000e7a1f09" containerName="registry-server" containerID="cri-o://57e2ea574d17f5f071a40103101154e3c3aa2c1f88b2c3c796148684879ddbf4" gracePeriod=2
Mar 10 17:16:57 crc kubenswrapper[4749]: I0310 17:16:57.655201 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:57 crc kubenswrapper[4749]: I0310 17:16:57.801550 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6s2d\" (UniqueName: \"kubernetes.io/projected/3e7bae71-1d16-46c6-b731-33000e7a1f09-kube-api-access-z6s2d\") pod \"3e7bae71-1d16-46c6-b731-33000e7a1f09\" (UID: \"3e7bae71-1d16-46c6-b731-33000e7a1f09\") "
Mar 10 17:16:57 crc kubenswrapper[4749]: I0310 17:16:57.801661 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7bae71-1d16-46c6-b731-33000e7a1f09-utilities\") pod \"3e7bae71-1d16-46c6-b731-33000e7a1f09\" (UID: \"3e7bae71-1d16-46c6-b731-33000e7a1f09\") "
Mar 10 17:16:57 crc kubenswrapper[4749]: I0310 17:16:57.801731 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7bae71-1d16-46c6-b731-33000e7a1f09-catalog-content\") pod \"3e7bae71-1d16-46c6-b731-33000e7a1f09\" (UID: \"3e7bae71-1d16-46c6-b731-33000e7a1f09\") "
Mar 10 17:16:57 crc kubenswrapper[4749]: I0310 17:16:57.802985 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e7bae71-1d16-46c6-b731-33000e7a1f09-utilities" (OuterVolumeSpecName: "utilities") pod "3e7bae71-1d16-46c6-b731-33000e7a1f09" (UID: "3e7bae71-1d16-46c6-b731-33000e7a1f09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 17:16:57 crc kubenswrapper[4749]: I0310 17:16:57.810614 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e7bae71-1d16-46c6-b731-33000e7a1f09-kube-api-access-z6s2d" (OuterVolumeSpecName: "kube-api-access-z6s2d") pod "3e7bae71-1d16-46c6-b731-33000e7a1f09" (UID: "3e7bae71-1d16-46c6-b731-33000e7a1f09"). InnerVolumeSpecName "kube-api-access-z6s2d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 17:16:57 crc kubenswrapper[4749]: I0310 17:16:57.904755 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6s2d\" (UniqueName: \"kubernetes.io/projected/3e7bae71-1d16-46c6-b731-33000e7a1f09-kube-api-access-z6s2d\") on node \"crc\" DevicePath \"\""
Mar 10 17:16:57 crc kubenswrapper[4749]: I0310 17:16:57.904824 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7bae71-1d16-46c6-b731-33000e7a1f09-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 17:16:57 crc kubenswrapper[4749]: I0310 17:16:57.967711 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e7bae71-1d16-46c6-b731-33000e7a1f09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e7bae71-1d16-46c6-b731-33000e7a1f09" (UID: "3e7bae71-1d16-46c6-b731-33000e7a1f09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.007190 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7bae71-1d16-46c6-b731-33000e7a1f09-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.164219 4749 generic.go:334] "Generic (PLEG): container finished" podID="3e7bae71-1d16-46c6-b731-33000e7a1f09" containerID="57e2ea574d17f5f071a40103101154e3c3aa2c1f88b2c3c796148684879ddbf4" exitCode=0
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.164275 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-th484"
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.164293 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-th484" event={"ID":"3e7bae71-1d16-46c6-b731-33000e7a1f09","Type":"ContainerDied","Data":"57e2ea574d17f5f071a40103101154e3c3aa2c1f88b2c3c796148684879ddbf4"}
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.165781 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-th484" event={"ID":"3e7bae71-1d16-46c6-b731-33000e7a1f09","Type":"ContainerDied","Data":"8154e391c104fbe81459ae4188aad2254bf06946616ae03f6b3d3d93a26e213e"}
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.165800 4749 scope.go:117] "RemoveContainer" containerID="57e2ea574d17f5f071a40103101154e3c3aa2c1f88b2c3c796148684879ddbf4"
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.201406 4749 scope.go:117] "RemoveContainer" containerID="99835415527b133a820d2a6a1e913f06806eba744d04e8a6a78663b81f763b6e"
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.205701 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-th484"]
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.214325 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-th484"]
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.225723 4749 scope.go:117] "RemoveContainer" containerID="75770efd0d6c3c966473d0c898697403110b2d06b7ca0d3d848ee92484942b42"
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.261152 4749 scope.go:117] "RemoveContainer" containerID="57e2ea574d17f5f071a40103101154e3c3aa2c1f88b2c3c796148684879ddbf4"
Mar 10 17:16:58 crc kubenswrapper[4749]: E0310 17:16:58.261551 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57e2ea574d17f5f071a40103101154e3c3aa2c1f88b2c3c796148684879ddbf4\": container with ID starting with 57e2ea574d17f5f071a40103101154e3c3aa2c1f88b2c3c796148684879ddbf4 not found: ID does not exist" containerID="57e2ea574d17f5f071a40103101154e3c3aa2c1f88b2c3c796148684879ddbf4"
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.261589 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e2ea574d17f5f071a40103101154e3c3aa2c1f88b2c3c796148684879ddbf4"} err="failed to get container status \"57e2ea574d17f5f071a40103101154e3c3aa2c1f88b2c3c796148684879ddbf4\": rpc error: code = NotFound desc = could not find container \"57e2ea574d17f5f071a40103101154e3c3aa2c1f88b2c3c796148684879ddbf4\": container with ID starting with 57e2ea574d17f5f071a40103101154e3c3aa2c1f88b2c3c796148684879ddbf4 not found: ID does not exist"
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.261615 4749 scope.go:117] "RemoveContainer" containerID="99835415527b133a820d2a6a1e913f06806eba744d04e8a6a78663b81f763b6e"
Mar 10 17:16:58 crc kubenswrapper[4749]: E0310 17:16:58.261836 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99835415527b133a820d2a6a1e913f06806eba744d04e8a6a78663b81f763b6e\": container with ID starting with 99835415527b133a820d2a6a1e913f06806eba744d04e8a6a78663b81f763b6e not found: ID does not exist" containerID="99835415527b133a820d2a6a1e913f06806eba744d04e8a6a78663b81f763b6e"
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.261883 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99835415527b133a820d2a6a1e913f06806eba744d04e8a6a78663b81f763b6e"} err="failed to get container status \"99835415527b133a820d2a6a1e913f06806eba744d04e8a6a78663b81f763b6e\": rpc error: code = NotFound desc = could not find container \"99835415527b133a820d2a6a1e913f06806eba744d04e8a6a78663b81f763b6e\": container with ID starting with 99835415527b133a820d2a6a1e913f06806eba744d04e8a6a78663b81f763b6e not found: ID does not exist"
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.261902 4749 scope.go:117] "RemoveContainer" containerID="75770efd0d6c3c966473d0c898697403110b2d06b7ca0d3d848ee92484942b42"
Mar 10 17:16:58 crc kubenswrapper[4749]: E0310 17:16:58.262115 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75770efd0d6c3c966473d0c898697403110b2d06b7ca0d3d848ee92484942b42\": container with ID starting with 75770efd0d6c3c966473d0c898697403110b2d06b7ca0d3d848ee92484942b42 not found: ID does not exist" containerID="75770efd0d6c3c966473d0c898697403110b2d06b7ca0d3d848ee92484942b42"
Mar 10 17:16:58 crc kubenswrapper[4749]: I0310 17:16:58.262132 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75770efd0d6c3c966473d0c898697403110b2d06b7ca0d3d848ee92484942b42"} err="failed to get container status \"75770efd0d6c3c966473d0c898697403110b2d06b7ca0d3d848ee92484942b42\": rpc error: code = NotFound desc = could not find container \"75770efd0d6c3c966473d0c898697403110b2d06b7ca0d3d848ee92484942b42\": container with ID starting with 75770efd0d6c3c966473d0c898697403110b2d06b7ca0d3d848ee92484942b42 not found: ID does not exist"
Mar 10 17:16:59 crc kubenswrapper[4749]: I0310 17:16:59.627107 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e7bae71-1d16-46c6-b731-33000e7a1f09" path="/var/lib/kubelet/pods/3e7bae71-1d16-46c6-b731-33000e7a1f09/volumes"
Mar 10 17:17:20 crc kubenswrapper[4749]: I0310 17:17:20.980806 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 17:17:20 crc kubenswrapper[4749]: I0310 17:17:20.981287 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.011540 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"]
Mar 10 17:17:22 crc kubenswrapper[4749]: E0310 17:17:22.012346 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7bae71-1d16-46c6-b731-33000e7a1f09" containerName="extract-content"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.012409 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7bae71-1d16-46c6-b731-33000e7a1f09" containerName="extract-content"
Mar 10 17:17:22 crc kubenswrapper[4749]: E0310 17:17:22.012472 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7bae71-1d16-46c6-b731-33000e7a1f09" containerName="extract-utilities"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.012488 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7bae71-1d16-46c6-b731-33000e7a1f09" containerName="extract-utilities"
Mar 10 17:17:22 crc kubenswrapper[4749]: E0310 17:17:22.012523 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7bae71-1d16-46c6-b731-33000e7a1f09" containerName="registry-server"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.012567 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7bae71-1d16-46c6-b731-33000e7a1f09" containerName="registry-server"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.012886 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7bae71-1d16-46c6-b731-33000e7a1f09" containerName="registry-server"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.014046 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.018066 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8w79f"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.031712 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.088186 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blqng\" (UniqueName: \"kubernetes.io/projected/166b2d31-2084-433f-9e16-2a3d865b687b-kube-api-access-blqng\") pod \"mariadb-copy-data\" (UID: \"166b2d31-2084-433f-9e16-2a3d865b687b\") " pod="openstack/mariadb-copy-data"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.088266 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2b4f58a7-2487-448f-b97b-9f0b66291bdd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b4f58a7-2487-448f-b97b-9f0b66291bdd\") pod \"mariadb-copy-data\" (UID: \"166b2d31-2084-433f-9e16-2a3d865b687b\") " pod="openstack/mariadb-copy-data"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.189585 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blqng\" (UniqueName: \"kubernetes.io/projected/166b2d31-2084-433f-9e16-2a3d865b687b-kube-api-access-blqng\") pod \"mariadb-copy-data\" (UID: \"166b2d31-2084-433f-9e16-2a3d865b687b\") " pod="openstack/mariadb-copy-data"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.189701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2b4f58a7-2487-448f-b97b-9f0b66291bdd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b4f58a7-2487-448f-b97b-9f0b66291bdd\") pod \"mariadb-copy-data\" (UID: \"166b2d31-2084-433f-9e16-2a3d865b687b\") " pod="openstack/mariadb-copy-data"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.193211 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.193252 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2b4f58a7-2487-448f-b97b-9f0b66291bdd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b4f58a7-2487-448f-b97b-9f0b66291bdd\") pod \"mariadb-copy-data\" (UID: \"166b2d31-2084-433f-9e16-2a3d865b687b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/50a7cda7357cbfc4fc58b33ca3d33d99136e928b9e817a28f701593160f19c6f/globalmount\"" pod="openstack/mariadb-copy-data"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.211666 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blqng\" (UniqueName: \"kubernetes.io/projected/166b2d31-2084-433f-9e16-2a3d865b687b-kube-api-access-blqng\") pod \"mariadb-copy-data\" (UID: \"166b2d31-2084-433f-9e16-2a3d865b687b\") " pod="openstack/mariadb-copy-data"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.223227 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2b4f58a7-2487-448f-b97b-9f0b66291bdd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b4f58a7-2487-448f-b97b-9f0b66291bdd\") pod \"mariadb-copy-data\" (UID: \"166b2d31-2084-433f-9e16-2a3d865b687b\") " pod="openstack/mariadb-copy-data"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.346026 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Mar 10 17:17:22 crc kubenswrapper[4749]: I0310 17:17:22.881616 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Mar 10 17:17:22 crc kubenswrapper[4749]: W0310 17:17:22.884077 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod166b2d31_2084_433f_9e16_2a3d865b687b.slice/crio-ecaa7b0bdf2dbd1bfc1f6037ee86f2b8dccd7313f810d183df1558406a12c56e WatchSource:0}: Error finding container ecaa7b0bdf2dbd1bfc1f6037ee86f2b8dccd7313f810d183df1558406a12c56e: Status 404 returned error can't find the container with id ecaa7b0bdf2dbd1bfc1f6037ee86f2b8dccd7313f810d183df1558406a12c56e
Mar 10 17:17:23 crc kubenswrapper[4749]: I0310 17:17:23.373701 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"166b2d31-2084-433f-9e16-2a3d865b687b","Type":"ContainerStarted","Data":"7b641d29981536baed72022c24a89d14766843a54cc9e8ad9bfc6f25c9d988e8"}
Mar 10 17:17:23 crc kubenswrapper[4749]: I0310 17:17:23.374111 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"166b2d31-2084-433f-9e16-2a3d865b687b","Type":"ContainerStarted","Data":"ecaa7b0bdf2dbd1bfc1f6037ee86f2b8dccd7313f810d183df1558406a12c56e"}
Mar 10 17:17:23 crc kubenswrapper[4749]: I0310 17:17:23.389649 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.389628766 podStartE2EDuration="3.389628766s" podCreationTimestamp="2026-03-10 17:17:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:17:23.387320603 +0000 UTC m=+5340.509186300" watchObservedRunningTime="2026-03-10 17:17:23.389628766 +0000 UTC m=+5340.511494453"
Mar 10 17:17:25 crc kubenswrapper[4749]: I0310 17:17:25.860761 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Mar 10 17:17:25 crc kubenswrapper[4749]: I0310 17:17:25.862578 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 10 17:17:25 crc kubenswrapper[4749]: I0310 17:17:25.870694 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Mar 10 17:17:25 crc kubenswrapper[4749]: I0310 17:17:25.947259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckqlp\" (UniqueName: \"kubernetes.io/projected/d87cd6e7-3bee-490c-83dd-f7b25ebc7272-kube-api-access-ckqlp\") pod \"mariadb-client\" (UID: \"d87cd6e7-3bee-490c-83dd-f7b25ebc7272\") " pod="openstack/mariadb-client"
Mar 10 17:17:26 crc kubenswrapper[4749]: I0310 17:17:26.048953 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckqlp\" (UniqueName: \"kubernetes.io/projected/d87cd6e7-3bee-490c-83dd-f7b25ebc7272-kube-api-access-ckqlp\") pod \"mariadb-client\" (UID: \"d87cd6e7-3bee-490c-83dd-f7b25ebc7272\") " pod="openstack/mariadb-client"
Mar 10 17:17:26 crc kubenswrapper[4749]: I0310 17:17:26.069407 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckqlp\" (UniqueName: \"kubernetes.io/projected/d87cd6e7-3bee-490c-83dd-f7b25ebc7272-kube-api-access-ckqlp\") pod \"mariadb-client\" (UID: \"d87cd6e7-3bee-490c-83dd-f7b25ebc7272\") " pod="openstack/mariadb-client"
Mar 10 17:17:26 crc kubenswrapper[4749]: I0310 17:17:26.182819 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 10 17:17:26 crc kubenswrapper[4749]: I0310 17:17:26.652876 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Mar 10 17:17:26 crc kubenswrapper[4749]: W0310 17:17:26.663023 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd87cd6e7_3bee_490c_83dd_f7b25ebc7272.slice/crio-ebf284c5fccee9a5f89f96a53c62fcc81f5bba7cf8dd028798a7f62a64a47918 WatchSource:0}: Error finding container ebf284c5fccee9a5f89f96a53c62fcc81f5bba7cf8dd028798a7f62a64a47918: Status 404 returned error can't find the container with id ebf284c5fccee9a5f89f96a53c62fcc81f5bba7cf8dd028798a7f62a64a47918
Mar 10 17:17:27 crc kubenswrapper[4749]: I0310 17:17:27.403935 4749 generic.go:334] "Generic (PLEG): container finished" podID="d87cd6e7-3bee-490c-83dd-f7b25ebc7272" containerID="b5051e943235eadb455ac5ff670027f6ebb8544451fa21ee2d9b2a03435e1bde" exitCode=0
Mar 10 17:17:27 crc kubenswrapper[4749]: I0310 17:17:27.404035 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d87cd6e7-3bee-490c-83dd-f7b25ebc7272","Type":"ContainerDied","Data":"b5051e943235eadb455ac5ff670027f6ebb8544451fa21ee2d9b2a03435e1bde"}
Mar 10 17:17:27 crc kubenswrapper[4749]: I0310 17:17:27.404336 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d87cd6e7-3bee-490c-83dd-f7b25ebc7272","Type":"ContainerStarted","Data":"ebf284c5fccee9a5f89f96a53c62fcc81f5bba7cf8dd028798a7f62a64a47918"}
Mar 10 17:17:28 crc kubenswrapper[4749]: I0310 17:17:28.685744 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 10 17:17:28 crc kubenswrapper[4749]: I0310 17:17:28.707305 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_d87cd6e7-3bee-490c-83dd-f7b25ebc7272/mariadb-client/0.log"
Mar 10 17:17:28 crc kubenswrapper[4749]: I0310 17:17:28.729278 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Mar 10 17:17:28 crc kubenswrapper[4749]: I0310 17:17:28.734018 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Mar 10 17:17:28 crc kubenswrapper[4749]: I0310 17:17:28.785101 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckqlp\" (UniqueName: \"kubernetes.io/projected/d87cd6e7-3bee-490c-83dd-f7b25ebc7272-kube-api-access-ckqlp\") pod \"d87cd6e7-3bee-490c-83dd-f7b25ebc7272\" (UID: \"d87cd6e7-3bee-490c-83dd-f7b25ebc7272\") "
Mar 10 17:17:28 crc kubenswrapper[4749]: I0310 17:17:28.791819 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d87cd6e7-3bee-490c-83dd-f7b25ebc7272-kube-api-access-ckqlp" (OuterVolumeSpecName: "kube-api-access-ckqlp") pod "d87cd6e7-3bee-490c-83dd-f7b25ebc7272" (UID: "d87cd6e7-3bee-490c-83dd-f7b25ebc7272"). InnerVolumeSpecName "kube-api-access-ckqlp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 17:17:28 crc kubenswrapper[4749]: I0310 17:17:28.841013 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Mar 10 17:17:28 crc kubenswrapper[4749]: E0310 17:17:28.841460 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d87cd6e7-3bee-490c-83dd-f7b25ebc7272" containerName="mariadb-client"
Mar 10 17:17:28 crc kubenswrapper[4749]: I0310 17:17:28.841477 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d87cd6e7-3bee-490c-83dd-f7b25ebc7272" containerName="mariadb-client"
Mar 10 17:17:28 crc kubenswrapper[4749]: I0310 17:17:28.841655 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d87cd6e7-3bee-490c-83dd-f7b25ebc7272" containerName="mariadb-client"
Mar 10 17:17:28 crc kubenswrapper[4749]: I0310 17:17:28.842345 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 10 17:17:28 crc kubenswrapper[4749]: I0310 17:17:28.848801 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Mar 10 17:17:28 crc kubenswrapper[4749]: I0310 17:17:28.888502 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d49dk\" (UniqueName: \"kubernetes.io/projected/99a82c38-69dd-497f-84af-a1d2ea1be10e-kube-api-access-d49dk\") pod \"mariadb-client\" (UID: \"99a82c38-69dd-497f-84af-a1d2ea1be10e\") " pod="openstack/mariadb-client"
Mar 10 17:17:28 crc kubenswrapper[4749]: I0310 17:17:28.888679 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckqlp\" (UniqueName: \"kubernetes.io/projected/d87cd6e7-3bee-490c-83dd-f7b25ebc7272-kube-api-access-ckqlp\") on node \"crc\" DevicePath \"\""
Mar 10 17:17:28 crc kubenswrapper[4749]: I0310 17:17:28.989360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d49dk\" (UniqueName:
\"kubernetes.io/projected/99a82c38-69dd-497f-84af-a1d2ea1be10e-kube-api-access-d49dk\") pod \"mariadb-client\" (UID: \"99a82c38-69dd-497f-84af-a1d2ea1be10e\") " pod="openstack/mariadb-client" Mar 10 17:17:29 crc kubenswrapper[4749]: I0310 17:17:29.009676 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d49dk\" (UniqueName: \"kubernetes.io/projected/99a82c38-69dd-497f-84af-a1d2ea1be10e-kube-api-access-d49dk\") pod \"mariadb-client\" (UID: \"99a82c38-69dd-497f-84af-a1d2ea1be10e\") " pod="openstack/mariadb-client" Mar 10 17:17:29 crc kubenswrapper[4749]: I0310 17:17:29.168711 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 17:17:29 crc kubenswrapper[4749]: I0310 17:17:29.390072 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 17:17:29 crc kubenswrapper[4749]: I0310 17:17:29.424530 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 17:17:29 crc kubenswrapper[4749]: I0310 17:17:29.424544 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebf284c5fccee9a5f89f96a53c62fcc81f5bba7cf8dd028798a7f62a64a47918" Mar 10 17:17:29 crc kubenswrapper[4749]: I0310 17:17:29.425953 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"99a82c38-69dd-497f-84af-a1d2ea1be10e","Type":"ContainerStarted","Data":"7f2587819d19e96754ff11f4936ea1035a2ee2965d82ce43355aac2fad58bfca"} Mar 10 17:17:29 crc kubenswrapper[4749]: I0310 17:17:29.454596 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="d87cd6e7-3bee-490c-83dd-f7b25ebc7272" podUID="99a82c38-69dd-497f-84af-a1d2ea1be10e" Mar 10 17:17:29 crc kubenswrapper[4749]: I0310 17:17:29.615328 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d87cd6e7-3bee-490c-83dd-f7b25ebc7272" path="/var/lib/kubelet/pods/d87cd6e7-3bee-490c-83dd-f7b25ebc7272/volumes" Mar 10 17:17:30 crc kubenswrapper[4749]: I0310 17:17:30.436146 4749 generic.go:334] "Generic (PLEG): container finished" podID="99a82c38-69dd-497f-84af-a1d2ea1be10e" containerID="75757afc171bcafeae6bf124f6d16942d183e49c261d76d64096dd63f8262253" exitCode=0 Mar 10 17:17:30 crc kubenswrapper[4749]: I0310 17:17:30.436283 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"99a82c38-69dd-497f-84af-a1d2ea1be10e","Type":"ContainerDied","Data":"75757afc171bcafeae6bf124f6d16942d183e49c261d76d64096dd63f8262253"} Mar 10 17:17:31 crc kubenswrapper[4749]: I0310 17:17:31.752565 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 17:17:31 crc kubenswrapper[4749]: I0310 17:17:31.771537 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_99a82c38-69dd-497f-84af-a1d2ea1be10e/mariadb-client/0.log" Mar 10 17:17:31 crc kubenswrapper[4749]: I0310 17:17:31.806286 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 10 17:17:31 crc kubenswrapper[4749]: I0310 17:17:31.815714 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 10 17:17:31 crc kubenswrapper[4749]: I0310 17:17:31.839660 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d49dk\" (UniqueName: \"kubernetes.io/projected/99a82c38-69dd-497f-84af-a1d2ea1be10e-kube-api-access-d49dk\") pod \"99a82c38-69dd-497f-84af-a1d2ea1be10e\" (UID: \"99a82c38-69dd-497f-84af-a1d2ea1be10e\") " Mar 10 17:17:31 crc kubenswrapper[4749]: I0310 17:17:31.846593 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a82c38-69dd-497f-84af-a1d2ea1be10e-kube-api-access-d49dk" (OuterVolumeSpecName: 
"kube-api-access-d49dk") pod "99a82c38-69dd-497f-84af-a1d2ea1be10e" (UID: "99a82c38-69dd-497f-84af-a1d2ea1be10e"). InnerVolumeSpecName "kube-api-access-d49dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:17:31 crc kubenswrapper[4749]: I0310 17:17:31.941752 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d49dk\" (UniqueName: \"kubernetes.io/projected/99a82c38-69dd-497f-84af-a1d2ea1be10e-kube-api-access-d49dk\") on node \"crc\" DevicePath \"\"" Mar 10 17:17:32 crc kubenswrapper[4749]: I0310 17:17:32.455177 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f2587819d19e96754ff11f4936ea1035a2ee2965d82ce43355aac2fad58bfca" Mar 10 17:17:32 crc kubenswrapper[4749]: I0310 17:17:32.455267 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 17:17:33 crc kubenswrapper[4749]: I0310 17:17:33.620120 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a82c38-69dd-497f-84af-a1d2ea1be10e" path="/var/lib/kubelet/pods/99a82c38-69dd-497f-84af-a1d2ea1be10e/volumes" Mar 10 17:17:50 crc kubenswrapper[4749]: I0310 17:17:50.980789 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:17:50 crc kubenswrapper[4749]: I0310 17:17:50.981431 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:17:50 crc kubenswrapper[4749]: I0310 17:17:50.981509 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 17:17:50 crc kubenswrapper[4749]: I0310 17:17:50.982414 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 17:17:50 crc kubenswrapper[4749]: I0310 17:17:50.982508 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" gracePeriod=600 Mar 10 17:17:51 crc kubenswrapper[4749]: E0310 17:17:51.105162 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:17:51 crc kubenswrapper[4749]: I0310 17:17:51.600606 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" exitCode=0 Mar 10 17:17:51 crc kubenswrapper[4749]: I0310 17:17:51.600652 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8"} Mar 10 17:17:51 crc kubenswrapper[4749]: I0310 17:17:51.600692 4749 scope.go:117] "RemoveContainer" containerID="3dc6824f04ac0b657b53434f59fc2f1cf018f8ac81376732d74b1a55125ca1c7" Mar 10 17:17:51 crc kubenswrapper[4749]: I0310 17:17:51.601334 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:17:51 crc kubenswrapper[4749]: E0310 17:17:51.601714 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:18:00 crc kubenswrapper[4749]: I0310 17:18:00.156070 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552718-klb5c"] Mar 10 17:18:00 crc kubenswrapper[4749]: E0310 17:18:00.158143 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a82c38-69dd-497f-84af-a1d2ea1be10e" containerName="mariadb-client" Mar 10 17:18:00 crc kubenswrapper[4749]: I0310 17:18:00.158169 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a82c38-69dd-497f-84af-a1d2ea1be10e" containerName="mariadb-client" Mar 10 17:18:00 crc kubenswrapper[4749]: I0310 17:18:00.158405 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a82c38-69dd-497f-84af-a1d2ea1be10e" containerName="mariadb-client" Mar 10 17:18:00 crc kubenswrapper[4749]: I0310 17:18:00.159067 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552718-klb5c" Mar 10 17:18:00 crc kubenswrapper[4749]: I0310 17:18:00.161298 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:18:00 crc kubenswrapper[4749]: I0310 17:18:00.161868 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:18:00 crc kubenswrapper[4749]: I0310 17:18:00.162796 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:18:00 crc kubenswrapper[4749]: I0310 17:18:00.165562 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552718-klb5c"] Mar 10 17:18:00 crc kubenswrapper[4749]: I0310 17:18:00.302732 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djlnb\" (UniqueName: \"kubernetes.io/projected/6a746a1c-dd59-4d02-a198-a9d8b239947d-kube-api-access-djlnb\") pod \"auto-csr-approver-29552718-klb5c\" (UID: \"6a746a1c-dd59-4d02-a198-a9d8b239947d\") " pod="openshift-infra/auto-csr-approver-29552718-klb5c" Mar 10 17:18:00 crc kubenswrapper[4749]: I0310 17:18:00.404712 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djlnb\" (UniqueName: \"kubernetes.io/projected/6a746a1c-dd59-4d02-a198-a9d8b239947d-kube-api-access-djlnb\") pod \"auto-csr-approver-29552718-klb5c\" (UID: \"6a746a1c-dd59-4d02-a198-a9d8b239947d\") " pod="openshift-infra/auto-csr-approver-29552718-klb5c" Mar 10 17:18:00 crc kubenswrapper[4749]: I0310 17:18:00.435565 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djlnb\" (UniqueName: \"kubernetes.io/projected/6a746a1c-dd59-4d02-a198-a9d8b239947d-kube-api-access-djlnb\") pod \"auto-csr-approver-29552718-klb5c\" (UID: \"6a746a1c-dd59-4d02-a198-a9d8b239947d\") " 
pod="openshift-infra/auto-csr-approver-29552718-klb5c" Mar 10 17:18:00 crc kubenswrapper[4749]: I0310 17:18:00.477718 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552718-klb5c" Mar 10 17:18:00 crc kubenswrapper[4749]: I0310 17:18:00.909404 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552718-klb5c"] Mar 10 17:18:00 crc kubenswrapper[4749]: I0310 17:18:00.918973 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 17:18:01 crc kubenswrapper[4749]: I0310 17:18:01.674524 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552718-klb5c" event={"ID":"6a746a1c-dd59-4d02-a198-a9d8b239947d","Type":"ContainerStarted","Data":"4b6a6e72826ab7bd86eb95c9e976d82b909019a01e8720cbef37ad5aeb4ee1de"} Mar 10 17:18:02 crc kubenswrapper[4749]: I0310 17:18:02.685318 4749 generic.go:334] "Generic (PLEG): container finished" podID="6a746a1c-dd59-4d02-a198-a9d8b239947d" containerID="76764a01ec7406824069e546672633adbc7cfe5d6ad7479b0e0cfa2945251261" exitCode=0 Mar 10 17:18:02 crc kubenswrapper[4749]: I0310 17:18:02.685387 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552718-klb5c" event={"ID":"6a746a1c-dd59-4d02-a198-a9d8b239947d","Type":"ContainerDied","Data":"76764a01ec7406824069e546672633adbc7cfe5d6ad7479b0e0cfa2945251261"} Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.021585 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552718-klb5c" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.207803 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djlnb\" (UniqueName: \"kubernetes.io/projected/6a746a1c-dd59-4d02-a198-a9d8b239947d-kube-api-access-djlnb\") pod \"6a746a1c-dd59-4d02-a198-a9d8b239947d\" (UID: \"6a746a1c-dd59-4d02-a198-a9d8b239947d\") " Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.215415 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a746a1c-dd59-4d02-a198-a9d8b239947d-kube-api-access-djlnb" (OuterVolumeSpecName: "kube-api-access-djlnb") pod "6a746a1c-dd59-4d02-a198-a9d8b239947d" (UID: "6a746a1c-dd59-4d02-a198-a9d8b239947d"). InnerVolumeSpecName "kube-api-access-djlnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.309628 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djlnb\" (UniqueName: \"kubernetes.io/projected/6a746a1c-dd59-4d02-a198-a9d8b239947d-kube-api-access-djlnb\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.397765 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 17:18:04 crc kubenswrapper[4749]: E0310 17:18:04.398070 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a746a1c-dd59-4d02-a198-a9d8b239947d" containerName="oc" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.398101 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a746a1c-dd59-4d02-a198-a9d8b239947d" containerName="oc" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.398255 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a746a1c-dd59-4d02-a198-a9d8b239947d" containerName="oc" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.399026 4749 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.401077 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.401218 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.402128 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.402209 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-2vm8h" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.402706 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.411605 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.425975 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.427228 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.445420 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.446704 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.464427 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.472628 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.512198 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd6f1622-50c3-4a95-80d9-e833ddc6deba-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.512284 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd6f1622-50c3-4a95-80d9-e833ddc6deba-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.512459 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6f1622-50c3-4a95-80d9-e833ddc6deba-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.512567 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9b59ddf0-9922-456d-9307-01e8fa875b86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9b59ddf0-9922-456d-9307-01e8fa875b86\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.512643 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2s5d\" (UniqueName: \"kubernetes.io/projected/bd6f1622-50c3-4a95-80d9-e833ddc6deba-kube-api-access-w2s5d\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.512725 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6f1622-50c3-4a95-80d9-e833ddc6deba-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.512800 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd6f1622-50c3-4a95-80d9-e833ddc6deba-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.512880 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6f1622-50c3-4a95-80d9-e833ddc6deba-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.614928 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be6afa7c-b5a9-484f-8f55-705241c391dc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.614991 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wk8p\" (UniqueName: \"kubernetes.io/projected/be6afa7c-b5a9-484f-8f55-705241c391dc-kube-api-access-2wk8p\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f932070d-6f40-4017-a1d2-cb205561989e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd6f1622-50c3-4a95-80d9-e833ddc6deba-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615162 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6f1622-50c3-4a95-80d9-e833ddc6deba-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615214 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f932070d-6f40-4017-a1d2-cb205561989e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615262 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f932070d-6f40-4017-a1d2-cb205561989e-config\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615296 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be6afa7c-b5a9-484f-8f55-705241c391dc-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615328 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6afa7c-b5a9-484f-8f55-705241c391dc-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615365 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9b59ddf0-9922-456d-9307-01e8fa875b86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9b59ddf0-9922-456d-9307-01e8fa875b86\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615456 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f04ee548-5f85-49fd-aa51-5bb353db5b9b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f04ee548-5f85-49fd-aa51-5bb353db5b9b\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1" Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615502 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/be6afa7c-b5a9-484f-8f55-705241c391dc-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615542 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2s5d\" (UniqueName: \"kubernetes.io/projected/bd6f1622-50c3-4a95-80d9-e833ddc6deba-kube-api-access-w2s5d\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615574 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be6afa7c-b5a9-484f-8f55-705241c391dc-config\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615576 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd6f1622-50c3-4a95-80d9-e833ddc6deba-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615609 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f932070d-6f40-4017-a1d2-cb205561989e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615648 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6f1622-50c3-4a95-80d9-e833ddc6deba-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd6f1622-50c3-4a95-80d9-e833ddc6deba-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615720 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f932070d-6f40-4017-a1d2-cb205561989e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j56px\" (UniqueName: \"kubernetes.io/projected/f932070d-6f40-4017-a1d2-cb205561989e-kube-api-access-j56px\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615801 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6f1622-50c3-4a95-80d9-e833ddc6deba-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615849 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be6afa7c-b5a9-484f-8f55-705241c391dc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd6f1622-50c3-4a95-80d9-e833ddc6deba-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615938 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-32042422-a790-42a1-b9b3-f3ac3a47ac0d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32042422-a790-42a1-b9b3-f3ac3a47ac0d\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.615987 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f932070d-6f40-4017-a1d2-cb205561989e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.617027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd6f1622-50c3-4a95-80d9-e833ddc6deba-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.618201 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd6f1622-50c3-4a95-80d9-e833ddc6deba-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.622009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6f1622-50c3-4a95-80d9-e833ddc6deba-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.622552 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6f1622-50c3-4a95-80d9-e833ddc6deba-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.630982 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.631024 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6f1622-50c3-4a95-80d9-e833ddc6deba-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.631062 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9b59ddf0-9922-456d-9307-01e8fa875b86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9b59ddf0-9922-456d-9307-01e8fa875b86\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/394b727492c290b264fa09559e85734bdcc24aa4929e0fce1e23a19e766e3065/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.635386 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2s5d\" (UniqueName: \"kubernetes.io/projected/bd6f1622-50c3-4a95-80d9-e833ddc6deba-kube-api-access-w2s5d\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.676840 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9b59ddf0-9922-456d-9307-01e8fa875b86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9b59ddf0-9922-456d-9307-01e8fa875b86\") pod \"ovsdbserver-nb-0\" (UID: \"bd6f1622-50c3-4a95-80d9-e833ddc6deba\") " pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.701643 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552718-klb5c" event={"ID":"6a746a1c-dd59-4d02-a198-a9d8b239947d","Type":"ContainerDied","Data":"4b6a6e72826ab7bd86eb95c9e976d82b909019a01e8720cbef37ad5aeb4ee1de"}
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.701690 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b6a6e72826ab7bd86eb95c9e976d82b909019a01e8720cbef37ad5aeb4ee1de"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.701751 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552718-klb5c"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717272 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be6afa7c-b5a9-484f-8f55-705241c391dc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717431 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-32042422-a790-42a1-b9b3-f3ac3a47ac0d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32042422-a790-42a1-b9b3-f3ac3a47ac0d\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717463 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f932070d-6f40-4017-a1d2-cb205561989e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717518 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be6afa7c-b5a9-484f-8f55-705241c391dc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717541 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wk8p\" (UniqueName: \"kubernetes.io/projected/be6afa7c-b5a9-484f-8f55-705241c391dc-kube-api-access-2wk8p\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717599 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f932070d-6f40-4017-a1d2-cb205561989e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717693 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f932070d-6f40-4017-a1d2-cb205561989e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717721 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f932070d-6f40-4017-a1d2-cb205561989e-config\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717745 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be6afa7c-b5a9-484f-8f55-705241c391dc-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6afa7c-b5a9-484f-8f55-705241c391dc-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717800 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f04ee548-5f85-49fd-aa51-5bb353db5b9b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f04ee548-5f85-49fd-aa51-5bb353db5b9b\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be6afa7c-b5a9-484f-8f55-705241c391dc-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be6afa7c-b5a9-484f-8f55-705241c391dc-config\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717894 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f932070d-6f40-4017-a1d2-cb205561989e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717929 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f932070d-6f40-4017-a1d2-cb205561989e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.717978 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j56px\" (UniqueName: \"kubernetes.io/projected/f932070d-6f40-4017-a1d2-cb205561989e-kube-api-access-j56px\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.718810 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.719470 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f932070d-6f40-4017-a1d2-cb205561989e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.719606 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be6afa7c-b5a9-484f-8f55-705241c391dc-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.719625 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be6afa7c-b5a9-484f-8f55-705241c391dc-config\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.721337 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be6afa7c-b5a9-484f-8f55-705241c391dc-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.722081 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f932070d-6f40-4017-a1d2-cb205561989e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.723279 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.723324 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f04ee548-5f85-49fd-aa51-5bb353db5b9b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f04ee548-5f85-49fd-aa51-5bb353db5b9b\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/498d8f647f82c1c472d0f58e0ab2664398caf66d6ef545905f78d8316892ad8a/globalmount\"" pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.725081 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.725280 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-32042422-a790-42a1-b9b3-f3ac3a47ac0d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32042422-a790-42a1-b9b3-f3ac3a47ac0d\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6b81208002273269e68c51cc28f2b6591e3261deb91a2f71e6604d339270e8ba/globalmount\"" pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.727506 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f932070d-6f40-4017-a1d2-cb205561989e-config\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.727595 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be6afa7c-b5a9-484f-8f55-705241c391dc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.728622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f932070d-6f40-4017-a1d2-cb205561989e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.737629 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f932070d-6f40-4017-a1d2-cb205561989e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.738223 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6afa7c-b5a9-484f-8f55-705241c391dc-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.738366 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f932070d-6f40-4017-a1d2-cb205561989e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.739212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be6afa7c-b5a9-484f-8f55-705241c391dc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.742508 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wk8p\" (UniqueName: \"kubernetes.io/projected/be6afa7c-b5a9-484f-8f55-705241c391dc-kube-api-access-2wk8p\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.748528 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j56px\" (UniqueName: \"kubernetes.io/projected/f932070d-6f40-4017-a1d2-cb205561989e-kube-api-access-j56px\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.777213 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f04ee548-5f85-49fd-aa51-5bb353db5b9b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f04ee548-5f85-49fd-aa51-5bb353db5b9b\") pod \"ovsdbserver-nb-1\" (UID: \"f932070d-6f40-4017-a1d2-cb205561989e\") " pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:04 crc kubenswrapper[4749]: I0310 17:18:04.782565 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-32042422-a790-42a1-b9b3-f3ac3a47ac0d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32042422-a790-42a1-b9b3-f3ac3a47ac0d\") pod \"ovsdbserver-nb-2\" (UID: \"be6afa7c-b5a9-484f-8f55-705241c391dc\") " pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:05 crc kubenswrapper[4749]: I0310 17:18:05.043804 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Mar 10 17:18:05 crc kubenswrapper[4749]: I0310 17:18:05.070579 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Mar 10 17:18:05 crc kubenswrapper[4749]: I0310 17:18:05.152191 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552712-scffb"]
Mar 10 17:18:05 crc kubenswrapper[4749]: I0310 17:18:05.160561 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552712-scffb"]
Mar 10 17:18:05 crc kubenswrapper[4749]: I0310 17:18:05.365255 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 10 17:18:05 crc kubenswrapper[4749]: W0310 17:18:05.369225 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd6f1622_50c3_4a95_80d9_e833ddc6deba.slice/crio-453df18bb49c49aecf1887d95e3298d91921c6100a0f0cd9100b30721946c472 WatchSource:0}: Error finding container 453df18bb49c49aecf1887d95e3298d91921c6100a0f0cd9100b30721946c472: Status 404 returned error can't find the container with id 453df18bb49c49aecf1887d95e3298d91921c6100a0f0cd9100b30721946c472
Mar 10 17:18:05 crc kubenswrapper[4749]: I0310 17:18:05.606611 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8"
Mar 10 17:18:05 crc kubenswrapper[4749]: E0310 17:18:05.607026 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:18:05 crc kubenswrapper[4749]: I0310 17:18:05.615659 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="486a1f15-4abd-46a9-9ce1-aa79e0e3cc94" path="/var/lib/kubelet/pods/486a1f15-4abd-46a9-9ce1-aa79e0e3cc94/volumes"
Mar 10 17:18:05 crc kubenswrapper[4749]: I0310 17:18:05.647724 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Mar 10 17:18:05 crc kubenswrapper[4749]: I0310 17:18:05.714464 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd6f1622-50c3-4a95-80d9-e833ddc6deba","Type":"ContainerStarted","Data":"d3a862f0b974075b98159b79acc50dcaa0dd7fa823768af98090019c11dde6b7"}
Mar 10 17:18:05 crc kubenswrapper[4749]: I0310 17:18:05.714513 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd6f1622-50c3-4a95-80d9-e833ddc6deba","Type":"ContainerStarted","Data":"453df18bb49c49aecf1887d95e3298d91921c6100a0f0cd9100b30721946c472"}
Mar 10 17:18:05 crc kubenswrapper[4749]: I0310 17:18:05.715965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"be6afa7c-b5a9-484f-8f55-705241c391dc","Type":"ContainerStarted","Data":"36ab788db052b1c905257c31063a8f2bf3f12551d79cf215ec868d6be320617f"}
Mar 10 17:18:05 crc kubenswrapper[4749]: I0310 17:18:05.748891 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.333165 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.334913 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.338739 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.339124 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-7jhb8"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.339365 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.339587 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.350694 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.361674 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"]
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.362985 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.367450 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"]
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.369093 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.376156 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.382765 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449195 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43a27b8-17bb-4826-9d18-0441ee12086c-config\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449290 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43a27b8-17bb-4826-9d18-0441ee12086c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449320 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f556053a-b874-43d4-a6e2-a2640c82a2bb-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449347 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f147f8a-662f-43aa-8698-e98aefaf1f4a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449393 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6dr6\" (UniqueName: \"kubernetes.io/projected/9f147f8a-662f-43aa-8698-e98aefaf1f4a-kube-api-access-g6dr6\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449413 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f147f8a-662f-43aa-8698-e98aefaf1f4a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449437 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f43a27b8-17bb-4826-9d18-0441ee12086c-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449456 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f556053a-b874-43d4-a6e2-a2640c82a2bb-config\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449486 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9f147f8a-662f-43aa-8698-e98aefaf1f4a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43a27b8-17bb-4826-9d18-0441ee12086c-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449533 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f556053a-b874-43d4-a6e2-a2640c82a2bb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449560 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a4483794-5236-4800-972a-80ad60ff2423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4483794-5236-4800-972a-80ad60ff2423\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449687 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f556053a-b874-43d4-a6e2-a2640c82a2bb-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f556053a-b874-43d4-a6e2-a2640c82a2bb-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449945 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f556053a-b874-43d4-a6e2-a2640c82a2bb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449966 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsw8m\" (UniqueName: \"kubernetes.io/projected/f556053a-b874-43d4-a6e2-a2640c82a2bb-kube-api-access-zsw8m\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.449986 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f43a27b8-17bb-4826-9d18-0441ee12086c-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.450054 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f147f8a-662f-43aa-8698-e98aefaf1f4a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.450132 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43a27b8-17bb-4826-9d18-0441ee12086c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.450221 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6638834c-0b96-4905-b18e-6882be84a65d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6638834c-0b96-4905-b18e-6882be84a65d\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.450253 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7527b7fd-6921-40d5-a651-89d7d9a52b42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7527b7fd-6921-40d5-a651-89d7d9a52b42\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.450274 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f147f8a-662f-43aa-8698-e98aefaf1f4a-config\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.450330 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv858\" (UniqueName: \"kubernetes.io/projected/f43a27b8-17bb-4826-9d18-0441ee12086c-kube-api-access-rv858\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.450389 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f147f8a-662f-43aa-8698-e98aefaf1f4a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.552371 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6638834c-0b96-4905-b18e-6882be84a65d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6638834c-0b96-4905-b18e-6882be84a65d\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.552446 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7527b7fd-6921-40d5-a651-89d7d9a52b42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7527b7fd-6921-40d5-a651-89d7d9a52b42\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.552509 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f147f8a-662f-43aa-8698-e98aefaf1f4a-config\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.552549 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv858\" (UniqueName: \"kubernetes.io/projected/f43a27b8-17bb-4826-9d18-0441ee12086c-kube-api-access-rv858\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.552606 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f147f8a-662f-43aa-8698-e98aefaf1f4a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0"
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.552635 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43a27b8-17bb-4826-9d18-0441ee12086c-config\") pod
\"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.552699 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43a27b8-17bb-4826-9d18-0441ee12086c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.552753 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f556053a-b874-43d4-a6e2-a2640c82a2bb-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.552788 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f147f8a-662f-43aa-8698-e98aefaf1f4a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.552840 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6dr6\" (UniqueName: \"kubernetes.io/projected/9f147f8a-662f-43aa-8698-e98aefaf1f4a-kube-api-access-g6dr6\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.552866 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f147f8a-662f-43aa-8698-e98aefaf1f4a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc 
kubenswrapper[4749]: I0310 17:18:06.552920 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f43a27b8-17bb-4826-9d18-0441ee12086c-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.552944 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f556053a-b874-43d4-a6e2-a2640c82a2bb-config\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.553002 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9f147f8a-662f-43aa-8698-e98aefaf1f4a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.553033 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43a27b8-17bb-4826-9d18-0441ee12086c-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.553058 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f556053a-b874-43d4-a6e2-a2640c82a2bb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.553130 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a4483794-5236-4800-972a-80ad60ff2423\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4483794-5236-4800-972a-80ad60ff2423\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.553237 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f556053a-b874-43d4-a6e2-a2640c82a2bb-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.553327 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f556053a-b874-43d4-a6e2-a2640c82a2bb-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.553435 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f556053a-b874-43d4-a6e2-a2640c82a2bb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.553459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsw8m\" (UniqueName: \"kubernetes.io/projected/f556053a-b874-43d4-a6e2-a2640c82a2bb-kube-api-access-zsw8m\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.553517 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f43a27b8-17bb-4826-9d18-0441ee12086c-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: 
\"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.553574 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f147f8a-662f-43aa-8698-e98aefaf1f4a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.553612 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43a27b8-17bb-4826-9d18-0441ee12086c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.554034 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f147f8a-662f-43aa-8698-e98aefaf1f4a-config\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.554477 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43a27b8-17bb-4826-9d18-0441ee12086c-config\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.554754 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f556053a-b874-43d4-a6e2-a2640c82a2bb-config\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.554951 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/f556053a-b874-43d4-a6e2-a2640c82a2bb-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.555233 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9f147f8a-662f-43aa-8698-e98aefaf1f4a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.555264 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f43a27b8-17bb-4826-9d18-0441ee12086c-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.554763 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f147f8a-662f-43aa-8698-e98aefaf1f4a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.555693 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f43a27b8-17bb-4826-9d18-0441ee12086c-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.556283 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f556053a-b874-43d4-a6e2-a2640c82a2bb-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 
17:18:06.558913 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f556053a-b874-43d4-a6e2-a2640c82a2bb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.559126 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f147f8a-662f-43aa-8698-e98aefaf1f4a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.559193 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43a27b8-17bb-4826-9d18-0441ee12086c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.560574 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f147f8a-662f-43aa-8698-e98aefaf1f4a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.561898 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43a27b8-17bb-4826-9d18-0441ee12086c-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.561973 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f556053a-b874-43d4-a6e2-a2640c82a2bb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.562092 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f147f8a-662f-43aa-8698-e98aefaf1f4a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.562708 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f43a27b8-17bb-4826-9d18-0441ee12086c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.562720 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.562770 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a4483794-5236-4800-972a-80ad60ff2423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4483794-5236-4800-972a-80ad60ff2423\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/00daae7b7f6e890f33bce7580ca85f3a0fadc9976d170f12ae42df78732d5003/globalmount\"" pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.563449 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.563552 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7527b7fd-6921-40d5-a651-89d7d9a52b42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7527b7fd-6921-40d5-a651-89d7d9a52b42\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/89c16b3c89fb54422034a61222e9ad92c1b0587c9f38b6761d94710867d40b3f/globalmount\"" pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.571212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsw8m\" (UniqueName: \"kubernetes.io/projected/f556053a-b874-43d4-a6e2-a2640c82a2bb-kube-api-access-zsw8m\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.572874 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.573027 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6638834c-0b96-4905-b18e-6882be84a65d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6638834c-0b96-4905-b18e-6882be84a65d\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/25177d26c134fa93c694343a72ceb56b8ab7216c655929ecb21e704f6e191891/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.576110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f556053a-b874-43d4-a6e2-a2640c82a2bb-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.580679 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6dr6\" (UniqueName: \"kubernetes.io/projected/9f147f8a-662f-43aa-8698-e98aefaf1f4a-kube-api-access-g6dr6\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.582343 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv858\" (UniqueName: \"kubernetes.io/projected/f43a27b8-17bb-4826-9d18-0441ee12086c-kube-api-access-rv858\") pod \"ovsdbserver-sb-1\" (UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.603467 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a4483794-5236-4800-972a-80ad60ff2423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4483794-5236-4800-972a-80ad60ff2423\") pod \"ovsdbserver-sb-1\" 
(UID: \"f43a27b8-17bb-4826-9d18-0441ee12086c\") " pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.603467 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6638834c-0b96-4905-b18e-6882be84a65d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6638834c-0b96-4905-b18e-6882be84a65d\") pod \"ovsdbserver-sb-0\" (UID: \"9f147f8a-662f-43aa-8698-e98aefaf1f4a\") " pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.603469 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7527b7fd-6921-40d5-a651-89d7d9a52b42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7527b7fd-6921-40d5-a651-89d7d9a52b42\") pod \"ovsdbserver-sb-2\" (UID: \"f556053a-b874-43d4-a6e2-a2640c82a2bb\") " pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.652791 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.698937 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.708419 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.739289 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd6f1622-50c3-4a95-80d9-e833ddc6deba","Type":"ContainerStarted","Data":"76d9ff75fbbd8592dd6874b2a044b308b2c625e4f918d93f1740c2994120eda5"} Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.755737 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"be6afa7c-b5a9-484f-8f55-705241c391dc","Type":"ContainerStarted","Data":"77c23ddaa82bcabf4925c388fd91ea924b62219f418ba2193f5aa9028923dfae"} Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.756237 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"be6afa7c-b5a9-484f-8f55-705241c391dc","Type":"ContainerStarted","Data":"12dde53e594eed6e2b6b91048db1be4a7574e9f573140024dfc1464a0b33da7a"} Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.763947 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"f932070d-6f40-4017-a1d2-cb205561989e","Type":"ContainerStarted","Data":"835cdacfe9527fae800192dfff584e4179da042638d41f0f7ed352f5594c047b"} Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.764011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"f932070d-6f40-4017-a1d2-cb205561989e","Type":"ContainerStarted","Data":"0abb16de6fef95ae5d770930a711dff71bf2a735ad9a46b3d28a1307c8cd0ab5"} Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.765100 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"f932070d-6f40-4017-a1d2-cb205561989e","Type":"ContainerStarted","Data":"e918309150fc1cb436c44548adf6832be5c6fac36594af9c47dfff219cfb8264"} Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.807095 4749 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.807075325 podStartE2EDuration="3.807075325s" podCreationTimestamp="2026-03-10 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:18:06.802557043 +0000 UTC m=+5383.924422750" watchObservedRunningTime="2026-03-10 17:18:06.807075325 +0000 UTC m=+5383.928941012" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.808911 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.8089032449999998 podStartE2EDuration="3.808903245s" podCreationTimestamp="2026-03-10 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:18:06.774792086 +0000 UTC m=+5383.896657773" watchObservedRunningTime="2026-03-10 17:18:06.808903245 +0000 UTC m=+5383.930768932" Mar 10 17:18:06 crc kubenswrapper[4749]: I0310 17:18:06.828845 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.828825208 podStartE2EDuration="3.828825208s" podCreationTimestamp="2026-03-10 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:18:06.823549154 +0000 UTC m=+5383.945414851" watchObservedRunningTime="2026-03-10 17:18:06.828825208 +0000 UTC m=+5383.950690885" Mar 10 17:18:08 crc kubenswrapper[4749]: W0310 17:18:07.229765 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f147f8a_662f_43aa_8698_e98aefaf1f4a.slice/crio-81609af628963b87eee4952a1a31c7a3310fa117af8fdbda1d6953f128278869 WatchSource:0}: Error finding container 81609af628963b87eee4952a1a31c7a3310fa117af8fdbda1d6953f128278869: Status 404 returned 
error can't find the container with id 81609af628963b87eee4952a1a31c7a3310fa117af8fdbda1d6953f128278869 Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:07.233434 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:07.355249 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 10 17:18:08 crc kubenswrapper[4749]: W0310 17:18:07.355683 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf556053a_b874_43d4_a6e2_a2640c82a2bb.slice/crio-164708b658b3ecadbe850430c6946ebc1e2373d2f4ae32ef00eb0a74d1578798 WatchSource:0}: Error finding container 164708b658b3ecadbe850430c6946ebc1e2373d2f4ae32ef00eb0a74d1578798: Status 404 returned error can't find the container with id 164708b658b3ecadbe850430c6946ebc1e2373d2f4ae32ef00eb0a74d1578798 Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:07.434463 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 10 17:18:08 crc kubenswrapper[4749]: W0310 17:18:07.436231 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf43a27b8_17bb_4826_9d18_0441ee12086c.slice/crio-7dbc4a18fb755862624dd8b84f385e020192fb0d0dd20d293899908459d46269 WatchSource:0}: Error finding container 7dbc4a18fb755862624dd8b84f385e020192fb0d0dd20d293899908459d46269: Status 404 returned error can't find the container with id 7dbc4a18fb755862624dd8b84f385e020192fb0d0dd20d293899908459d46269 Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:07.721407 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:07.775734 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"9f147f8a-662f-43aa-8698-e98aefaf1f4a","Type":"ContainerStarted","Data":"1ce764bb8a6c25bddb252b05ad42a62fbfd3d6a525b6ea9db5b1a41c7100ece6"} Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:07.775776 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9f147f8a-662f-43aa-8698-e98aefaf1f4a","Type":"ContainerStarted","Data":"2565239e08fce5c9cc711626cfeeafcdbbeb48103af8a3a5ffaf7157cf6f3a88"} Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:07.775785 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9f147f8a-662f-43aa-8698-e98aefaf1f4a","Type":"ContainerStarted","Data":"81609af628963b87eee4952a1a31c7a3310fa117af8fdbda1d6953f128278869"} Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:07.779462 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"f556053a-b874-43d4-a6e2-a2640c82a2bb","Type":"ContainerStarted","Data":"236fe4ae682b59c6a2482d79fd5ff22219e36c1aabeb630270a2dc1889e7315c"} Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:07.779510 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"f556053a-b874-43d4-a6e2-a2640c82a2bb","Type":"ContainerStarted","Data":"f645c77f00078fbac44803585262366fad98d2448c790795ea6c4194218542a5"} Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:07.779522 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"f556053a-b874-43d4-a6e2-a2640c82a2bb","Type":"ContainerStarted","Data":"164708b658b3ecadbe850430c6946ebc1e2373d2f4ae32ef00eb0a74d1578798"} Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:07.781007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"f43a27b8-17bb-4826-9d18-0441ee12086c","Type":"ContainerStarted","Data":"68df28662e6b63033e9a7fdd4e344ac3cb4870236e5f9ca73e77c69915d3cbe0"} Mar 10 17:18:08 crc 
kubenswrapper[4749]: I0310 17:18:07.781035 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"f43a27b8-17bb-4826-9d18-0441ee12086c","Type":"ContainerStarted","Data":"7dbc4a18fb755862624dd8b84f385e020192fb0d0dd20d293899908459d46269"} Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:07.802845 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=2.802829234 podStartE2EDuration="2.802829234s" podCreationTimestamp="2026-03-10 17:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:18:07.80196901 +0000 UTC m=+5384.923834697" watchObservedRunningTime="2026-03-10 17:18:07.802829234 +0000 UTC m=+5384.924694921" Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:08.045999 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:08.071481 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:08.082089 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:08.111632 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.111611649 podStartE2EDuration="3.111611649s" podCreationTimestamp="2026-03-10 17:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:18:07.828963386 +0000 UTC m=+5384.950829073" watchObservedRunningTime="2026-03-10 17:18:08.111611649 +0000 UTC m=+5385.233477336" Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:08.114550 4749 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:08.790770 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"f43a27b8-17bb-4826-9d18-0441ee12086c","Type":"ContainerStarted","Data":"6add8986c5e841dd9ca5bcfaafb6bb99aca42880b36a21320df55e57e3f91b6e"} Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:08.791722 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:08.791761 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Mar 10 17:18:08 crc kubenswrapper[4749]: I0310 17:18:08.811759 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.81173583 podStartE2EDuration="3.81173583s" podCreationTimestamp="2026-03-10 17:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:18:08.805356556 +0000 UTC m=+5385.927222243" watchObservedRunningTime="2026-03-10 17:18:08.81173583 +0000 UTC m=+5385.933601517" Mar 10 17:18:09 crc kubenswrapper[4749]: I0310 17:18:09.653850 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:09 crc kubenswrapper[4749]: I0310 17:18:09.699583 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:09 crc kubenswrapper[4749]: I0310 17:18:09.709052 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:09 crc kubenswrapper[4749]: I0310 17:18:09.720432 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:10 crc kubenswrapper[4749]: 
I0310 17:18:10.083279 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.132411 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.352427 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-594d96f99f-lmhcf"] Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.353695 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.358157 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.365099 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-594d96f99f-lmhcf"] Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.425233 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gsnl\" (UniqueName: \"kubernetes.io/projected/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-kube-api-access-5gsnl\") pod \"dnsmasq-dns-594d96f99f-lmhcf\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.425292 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-config\") pod \"dnsmasq-dns-594d96f99f-lmhcf\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.425358 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-dns-svc\") pod \"dnsmasq-dns-594d96f99f-lmhcf\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.425399 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-ovsdbserver-nb\") pod \"dnsmasq-dns-594d96f99f-lmhcf\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.526558 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gsnl\" (UniqueName: \"kubernetes.io/projected/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-kube-api-access-5gsnl\") pod \"dnsmasq-dns-594d96f99f-lmhcf\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.526796 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-config\") pod \"dnsmasq-dns-594d96f99f-lmhcf\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.526841 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-dns-svc\") pod \"dnsmasq-dns-594d96f99f-lmhcf\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.526859 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-ovsdbserver-nb\") pod \"dnsmasq-dns-594d96f99f-lmhcf\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.527705 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-ovsdbserver-nb\") pod \"dnsmasq-dns-594d96f99f-lmhcf\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.527710 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-config\") pod \"dnsmasq-dns-594d96f99f-lmhcf\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.527991 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-dns-svc\") pod \"dnsmasq-dns-594d96f99f-lmhcf\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.561849 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gsnl\" (UniqueName: \"kubernetes.io/projected/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-kube-api-access-5gsnl\") pod \"dnsmasq-dns-594d96f99f-lmhcf\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.674280 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.768904 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:10 crc kubenswrapper[4749]: I0310 17:18:10.856327 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 10 17:18:11 crc kubenswrapper[4749]: I0310 17:18:11.100507 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-594d96f99f-lmhcf"] Mar 10 17:18:11 crc kubenswrapper[4749]: W0310 17:18:11.109413 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddea24c1b_3ed2_4706_a2c8_870c2f7f6840.slice/crio-3595098506f83be013dac7dac39260b592f60e3067edfd22be79a6c6aafc568f WatchSource:0}: Error finding container 3595098506f83be013dac7dac39260b592f60e3067edfd22be79a6c6aafc568f: Status 404 returned error can't find the container with id 3595098506f83be013dac7dac39260b592f60e3067edfd22be79a6c6aafc568f Mar 10 17:18:11 crc kubenswrapper[4749]: I0310 17:18:11.653524 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:11 crc kubenswrapper[4749]: I0310 17:18:11.699493 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:11 crc kubenswrapper[4749]: I0310 17:18:11.709416 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:11 crc kubenswrapper[4749]: I0310 17:18:11.824606 4749 generic.go:334] "Generic (PLEG): container finished" podID="dea24c1b-3ed2-4706-a2c8-870c2f7f6840" containerID="b902d36e23b73c64abe473bedb2acce13abcaf5f50dc66cfb00cad7b3e9c3986" exitCode=0 Mar 10 17:18:11 crc kubenswrapper[4749]: I0310 17:18:11.825780 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" event={"ID":"dea24c1b-3ed2-4706-a2c8-870c2f7f6840","Type":"ContainerDied","Data":"b902d36e23b73c64abe473bedb2acce13abcaf5f50dc66cfb00cad7b3e9c3986"} Mar 10 17:18:11 crc kubenswrapper[4749]: I0310 17:18:11.825809 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" event={"ID":"dea24c1b-3ed2-4706-a2c8-870c2f7f6840","Type":"ContainerStarted","Data":"3595098506f83be013dac7dac39260b592f60e3067edfd22be79a6c6aafc568f"} Mar 10 17:18:12 crc kubenswrapper[4749]: I0310 17:18:12.693808 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:12 crc kubenswrapper[4749]: I0310 17:18:12.741225 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 10 17:18:12 crc kubenswrapper[4749]: I0310 17:18:12.746930 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:12 crc kubenswrapper[4749]: I0310 17:18:12.748132 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:12 crc kubenswrapper[4749]: I0310 17:18:12.805355 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Mar 10 17:18:12 crc kubenswrapper[4749]: I0310 17:18:12.846700 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" event={"ID":"dea24c1b-3ed2-4706-a2c8-870c2f7f6840","Type":"ContainerStarted","Data":"5db467051b488549403371203e08a9424760d747008783e53802b4d8935b9323"} Mar 10 17:18:12 crc kubenswrapper[4749]: I0310 17:18:12.848265 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:12 crc kubenswrapper[4749]: I0310 17:18:12.927569 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovsdbserver-sb-1" Mar 10 17:18:12 crc kubenswrapper[4749]: I0310 17:18:12.959573 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" podStartSLOduration=2.959552494 podStartE2EDuration="2.959552494s" podCreationTimestamp="2026-03-10 17:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:18:12.90250742 +0000 UTC m=+5390.024373107" watchObservedRunningTime="2026-03-10 17:18:12.959552494 +0000 UTC m=+5390.081418171" Mar 10 17:18:12 crc kubenswrapper[4749]: I0310 17:18:12.979986 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-594d96f99f-lmhcf"] Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.017904 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68577db887-pd8q5"] Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.020680 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.023448 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.068825 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68577db887-pd8q5"] Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.096297 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-config\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.096367 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-dns-svc\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.096493 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-ovsdbserver-nb\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.096558 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-ovsdbserver-sb\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" 
Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.096616 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7hgb\" (UniqueName: \"kubernetes.io/projected/50da7d2a-6f11-4f50-b065-10dfc13affc2-kube-api-access-k7hgb\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.198991 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-dns-svc\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.199192 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-ovsdbserver-nb\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.199227 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-ovsdbserver-sb\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.199288 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7hgb\" (UniqueName: \"kubernetes.io/projected/50da7d2a-6f11-4f50-b065-10dfc13affc2-kube-api-access-k7hgb\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc 
kubenswrapper[4749]: I0310 17:18:13.199316 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-config\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.200048 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-dns-svc\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.200172 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-ovsdbserver-nb\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.200446 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-config\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.200738 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-ovsdbserver-sb\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.232424 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k7hgb\" (UniqueName: \"kubernetes.io/projected/50da7d2a-6f11-4f50-b065-10dfc13affc2-kube-api-access-k7hgb\") pod \"dnsmasq-dns-68577db887-pd8q5\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.344458 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.779882 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68577db887-pd8q5"] Mar 10 17:18:13 crc kubenswrapper[4749]: I0310 17:18:13.853732 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68577db887-pd8q5" event={"ID":"50da7d2a-6f11-4f50-b065-10dfc13affc2","Type":"ContainerStarted","Data":"41972297a963b1bbe48d0f5cb48646ff2b4268b82651785b9eb3900ffaa080f3"} Mar 10 17:18:14 crc kubenswrapper[4749]: I0310 17:18:14.861721 4749 generic.go:334] "Generic (PLEG): container finished" podID="50da7d2a-6f11-4f50-b065-10dfc13affc2" containerID="f1b5e3e539ed523d22bdaf861bd70a26f3440aa03b762e8f08283864461abb62" exitCode=0 Mar 10 17:18:14 crc kubenswrapper[4749]: I0310 17:18:14.861838 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68577db887-pd8q5" event={"ID":"50da7d2a-6f11-4f50-b065-10dfc13affc2","Type":"ContainerDied","Data":"f1b5e3e539ed523d22bdaf861bd70a26f3440aa03b762e8f08283864461abb62"} Mar 10 17:18:14 crc kubenswrapper[4749]: I0310 17:18:14.862180 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" podUID="dea24c1b-3ed2-4706-a2c8-870c2f7f6840" containerName="dnsmasq-dns" containerID="cri-o://5db467051b488549403371203e08a9424760d747008783e53802b4d8935b9323" gracePeriod=10 Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.144048 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 
10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.145326 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.153281 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.155186 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.232415 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgshs\" (UniqueName: \"kubernetes.io/projected/0be86646-03b3-476c-8c66-e80ffa63fd7f-kube-api-access-lgshs\") pod \"ovn-copy-data\" (UID: \"0be86646-03b3-476c-8c66-e80ffa63fd7f\") " pod="openstack/ovn-copy-data" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.232826 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-407fb6b8-72a1-44c8-93d5-a58dc03bed88\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-407fb6b8-72a1-44c8-93d5-a58dc03bed88\") pod \"ovn-copy-data\" (UID: \"0be86646-03b3-476c-8c66-e80ffa63fd7f\") " pod="openstack/ovn-copy-data" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.232879 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/0be86646-03b3-476c-8c66-e80ffa63fd7f-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"0be86646-03b3-476c-8c66-e80ffa63fd7f\") " pod="openstack/ovn-copy-data" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.334707 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgshs\" (UniqueName: \"kubernetes.io/projected/0be86646-03b3-476c-8c66-e80ffa63fd7f-kube-api-access-lgshs\") pod \"ovn-copy-data\" (UID: \"0be86646-03b3-476c-8c66-e80ffa63fd7f\") " 
pod="openstack/ovn-copy-data" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.334808 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-407fb6b8-72a1-44c8-93d5-a58dc03bed88\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-407fb6b8-72a1-44c8-93d5-a58dc03bed88\") pod \"ovn-copy-data\" (UID: \"0be86646-03b3-476c-8c66-e80ffa63fd7f\") " pod="openstack/ovn-copy-data" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.334851 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/0be86646-03b3-476c-8c66-e80ffa63fd7f-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"0be86646-03b3-476c-8c66-e80ffa63fd7f\") " pod="openstack/ovn-copy-data" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.343731 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/0be86646-03b3-476c-8c66-e80ffa63fd7f-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"0be86646-03b3-476c-8c66-e80ffa63fd7f\") " pod="openstack/ovn-copy-data" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.352015 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.352072 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-407fb6b8-72a1-44c8-93d5-a58dc03bed88\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-407fb6b8-72a1-44c8-93d5-a58dc03bed88\") pod \"ovn-copy-data\" (UID: \"0be86646-03b3-476c-8c66-e80ffa63fd7f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3a0f40b24591c9f2bd5155b482a533e225bcad662449366244c932ff1651e7d1/globalmount\"" pod="openstack/ovn-copy-data" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.352934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgshs\" (UniqueName: \"kubernetes.io/projected/0be86646-03b3-476c-8c66-e80ffa63fd7f-kube-api-access-lgshs\") pod \"ovn-copy-data\" (UID: \"0be86646-03b3-476c-8c66-e80ffa63fd7f\") " pod="openstack/ovn-copy-data" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.387667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-407fb6b8-72a1-44c8-93d5-a58dc03bed88\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-407fb6b8-72a1-44c8-93d5-a58dc03bed88\") pod \"ovn-copy-data\" (UID: \"0be86646-03b3-476c-8c66-e80ffa63fd7f\") " pod="openstack/ovn-copy-data" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.432899 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.482763 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.544045 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-dns-svc\") pod \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.544130 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gsnl\" (UniqueName: \"kubernetes.io/projected/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-kube-api-access-5gsnl\") pod \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.544235 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-ovsdbserver-nb\") pod \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.544360 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-config\") pod \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\" (UID: \"dea24c1b-3ed2-4706-a2c8-870c2f7f6840\") " Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.564298 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-kube-api-access-5gsnl" (OuterVolumeSpecName: "kube-api-access-5gsnl") pod "dea24c1b-3ed2-4706-a2c8-870c2f7f6840" (UID: "dea24c1b-3ed2-4706-a2c8-870c2f7f6840"). InnerVolumeSpecName "kube-api-access-5gsnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.600689 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-config" (OuterVolumeSpecName: "config") pod "dea24c1b-3ed2-4706-a2c8-870c2f7f6840" (UID: "dea24c1b-3ed2-4706-a2c8-870c2f7f6840"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.612925 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dea24c1b-3ed2-4706-a2c8-870c2f7f6840" (UID: "dea24c1b-3ed2-4706-a2c8-870c2f7f6840"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.613764 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dea24c1b-3ed2-4706-a2c8-870c2f7f6840" (UID: "dea24c1b-3ed2-4706-a2c8-870c2f7f6840"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.646414 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.646444 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-config\") on node \"crc\" DevicePath \"\""
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.646453 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.646463 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gsnl\" (UniqueName: \"kubernetes.io/projected/dea24c1b-3ed2-4706-a2c8-870c2f7f6840-kube-api-access-5gsnl\") on node \"crc\" DevicePath \"\""
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.871079 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68577db887-pd8q5" event={"ID":"50da7d2a-6f11-4f50-b065-10dfc13affc2","Type":"ContainerStarted","Data":"b2ec6e620b2c600ea2c20fb4f7dbef9bf8ce5fa12b10b8e4b7ab80755972b9ef"}
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.872074 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68577db887-pd8q5"
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.873512 4749 generic.go:334] "Generic (PLEG): container finished" podID="dea24c1b-3ed2-4706-a2c8-870c2f7f6840" containerID="5db467051b488549403371203e08a9424760d747008783e53802b4d8935b9323" exitCode=0
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.873542 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" event={"ID":"dea24c1b-3ed2-4706-a2c8-870c2f7f6840","Type":"ContainerDied","Data":"5db467051b488549403371203e08a9424760d747008783e53802b4d8935b9323"}
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.873560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594d96f99f-lmhcf" event={"ID":"dea24c1b-3ed2-4706-a2c8-870c2f7f6840","Type":"ContainerDied","Data":"3595098506f83be013dac7dac39260b592f60e3067edfd22be79a6c6aafc568f"}
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.873580 4749 scope.go:117] "RemoveContainer" containerID="5db467051b488549403371203e08a9424760d747008783e53802b4d8935b9323"
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.873678 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-594d96f99f-lmhcf"
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.891341 4749 scope.go:117] "RemoveContainer" containerID="b902d36e23b73c64abe473bedb2acce13abcaf5f50dc66cfb00cad7b3e9c3986"
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.910511 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68577db887-pd8q5" podStartSLOduration=3.91048838 podStartE2EDuration="3.91048838s" podCreationTimestamp="2026-03-10 17:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:18:15.895264364 +0000 UTC m=+5393.017130051" watchObservedRunningTime="2026-03-10 17:18:15.91048838 +0000 UTC m=+5393.032354067"
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.913632 4749 scope.go:117] "RemoveContainer" containerID="5db467051b488549403371203e08a9424760d747008783e53802b4d8935b9323"
Mar 10 17:18:15 crc kubenswrapper[4749]: E0310 17:18:15.914810 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5db467051b488549403371203e08a9424760d747008783e53802b4d8935b9323\": container with ID starting with 5db467051b488549403371203e08a9424760d747008783e53802b4d8935b9323 not found: ID does not exist" containerID="5db467051b488549403371203e08a9424760d747008783e53802b4d8935b9323"
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.914844 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5db467051b488549403371203e08a9424760d747008783e53802b4d8935b9323"} err="failed to get container status \"5db467051b488549403371203e08a9424760d747008783e53802b4d8935b9323\": rpc error: code = NotFound desc = could not find container \"5db467051b488549403371203e08a9424760d747008783e53802b4d8935b9323\": container with ID starting with 5db467051b488549403371203e08a9424760d747008783e53802b4d8935b9323 not found: ID does not exist"
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.914866 4749 scope.go:117] "RemoveContainer" containerID="b902d36e23b73c64abe473bedb2acce13abcaf5f50dc66cfb00cad7b3e9c3986"
Mar 10 17:18:15 crc kubenswrapper[4749]: E0310 17:18:15.915104 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b902d36e23b73c64abe473bedb2acce13abcaf5f50dc66cfb00cad7b3e9c3986\": container with ID starting with b902d36e23b73c64abe473bedb2acce13abcaf5f50dc66cfb00cad7b3e9c3986 not found: ID does not exist" containerID="b902d36e23b73c64abe473bedb2acce13abcaf5f50dc66cfb00cad7b3e9c3986"
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.915127 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b902d36e23b73c64abe473bedb2acce13abcaf5f50dc66cfb00cad7b3e9c3986"} err="failed to get container status \"b902d36e23b73c64abe473bedb2acce13abcaf5f50dc66cfb00cad7b3e9c3986\": rpc error: code = NotFound desc = could not find container \"b902d36e23b73c64abe473bedb2acce13abcaf5f50dc66cfb00cad7b3e9c3986\": container with ID starting with b902d36e23b73c64abe473bedb2acce13abcaf5f50dc66cfb00cad7b3e9c3986 not found: ID does not exist"
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.943978 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-594d96f99f-lmhcf"]
Mar 10 17:18:15 crc kubenswrapper[4749]: I0310 17:18:15.956468 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-594d96f99f-lmhcf"]
Mar 10 17:18:16 crc kubenswrapper[4749]: I0310 17:18:16.103685 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"]
Mar 10 17:18:16 crc kubenswrapper[4749]: W0310 17:18:16.107642 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0be86646_03b3_476c_8c66_e80ffa63fd7f.slice/crio-4289fae8f2b1c4122a24d113f4ded6ebbd38e39b986819a993f05689833d9329 WatchSource:0}: Error finding container 4289fae8f2b1c4122a24d113f4ded6ebbd38e39b986819a993f05689833d9329: Status 404 returned error can't find the container with id 4289fae8f2b1c4122a24d113f4ded6ebbd38e39b986819a993f05689833d9329
Mar 10 17:18:16 crc kubenswrapper[4749]: I0310 17:18:16.885955 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"0be86646-03b3-476c-8c66-e80ffa63fd7f","Type":"ContainerStarted","Data":"4289fae8f2b1c4122a24d113f4ded6ebbd38e39b986819a993f05689833d9329"}
Mar 10 17:18:17 crc kubenswrapper[4749]: I0310 17:18:17.607167 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8"
Mar 10 17:18:17 crc kubenswrapper[4749]: E0310 17:18:17.607830 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:18:17 crc kubenswrapper[4749]: I0310 17:18:17.617611 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dea24c1b-3ed2-4706-a2c8-870c2f7f6840" path="/var/lib/kubelet/pods/dea24c1b-3ed2-4706-a2c8-870c2f7f6840/volumes"
Mar 10 17:18:19 crc kubenswrapper[4749]: I0310 17:18:19.933692 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"0be86646-03b3-476c-8c66-e80ffa63fd7f","Type":"ContainerStarted","Data":"f66cbed1ba937d36d719253acd1e0929b512f23c488f4f6a19fa5a70c630ab47"}
Mar 10 17:18:19 crc kubenswrapper[4749]: I0310 17:18:19.960985 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.15837433 podStartE2EDuration="5.960966822s" podCreationTimestamp="2026-03-10 17:18:14 +0000 UTC" firstStartedPulling="2026-03-10 17:18:16.109803392 +0000 UTC m=+5393.231669079" lastFinishedPulling="2026-03-10 17:18:18.912395884 +0000 UTC m=+5396.034261571" observedRunningTime="2026-03-10 17:18:19.959730438 +0000 UTC m=+5397.081596125" watchObservedRunningTime="2026-03-10 17:18:19.960966822 +0000 UTC m=+5397.082832509"
Mar 10 17:18:23 crc kubenswrapper[4749]: I0310 17:18:23.346588 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68577db887-pd8q5"
Mar 10 17:18:23 crc kubenswrapper[4749]: I0310 17:18:23.405076 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-cq62w"]
Mar 10 17:18:23 crc kubenswrapper[4749]: I0310 17:18:23.405438 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" podUID="8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a" containerName="dnsmasq-dns" containerID="cri-o://0546bd27f19ac462616224245e563b5b87d44af898a5d52c230f762a9dcc0879" gracePeriod=10
Mar 10 17:18:23 crc kubenswrapper[4749]: I0310 17:18:23.933111 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w"
Mar 10 17:18:23 crc kubenswrapper[4749]: I0310 17:18:23.963780 4749 generic.go:334] "Generic (PLEG): container finished" podID="8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a" containerID="0546bd27f19ac462616224245e563b5b87d44af898a5d52c230f762a9dcc0879" exitCode=0
Mar 10 17:18:23 crc kubenswrapper[4749]: I0310 17:18:23.963837 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w"
Mar 10 17:18:23 crc kubenswrapper[4749]: I0310 17:18:23.963824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" event={"ID":"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a","Type":"ContainerDied","Data":"0546bd27f19ac462616224245e563b5b87d44af898a5d52c230f762a9dcc0879"}
Mar 10 17:18:23 crc kubenswrapper[4749]: I0310 17:18:23.963987 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-cq62w" event={"ID":"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a","Type":"ContainerDied","Data":"4d3e3a489fffeae0a1dfa4eaa965a1e679571f52d862891ef97a7ccc308e40fb"}
Mar 10 17:18:23 crc kubenswrapper[4749]: I0310 17:18:23.964021 4749 scope.go:117] "RemoveContainer" containerID="0546bd27f19ac462616224245e563b5b87d44af898a5d52c230f762a9dcc0879"
Mar 10 17:18:23 crc kubenswrapper[4749]: I0310 17:18:23.981750 4749 scope.go:117] "RemoveContainer" containerID="30105a8f299f98e4b204912135e1c1a3ef2427ee0f9ee89d9689d2ece0682dc1"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.001101 4749 scope.go:117] "RemoveContainer" containerID="0546bd27f19ac462616224245e563b5b87d44af898a5d52c230f762a9dcc0879"
Mar 10 17:18:24 crc kubenswrapper[4749]: E0310 17:18:24.003041 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0546bd27f19ac462616224245e563b5b87d44af898a5d52c230f762a9dcc0879\": container with ID starting with 0546bd27f19ac462616224245e563b5b87d44af898a5d52c230f762a9dcc0879 not found: ID does not exist" containerID="0546bd27f19ac462616224245e563b5b87d44af898a5d52c230f762a9dcc0879"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.003078 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0546bd27f19ac462616224245e563b5b87d44af898a5d52c230f762a9dcc0879"} err="failed to get container status \"0546bd27f19ac462616224245e563b5b87d44af898a5d52c230f762a9dcc0879\": rpc error: code = NotFound desc = could not find container \"0546bd27f19ac462616224245e563b5b87d44af898a5d52c230f762a9dcc0879\": container with ID starting with 0546bd27f19ac462616224245e563b5b87d44af898a5d52c230f762a9dcc0879 not found: ID does not exist"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.003102 4749 scope.go:117] "RemoveContainer" containerID="30105a8f299f98e4b204912135e1c1a3ef2427ee0f9ee89d9689d2ece0682dc1"
Mar 10 17:18:24 crc kubenswrapper[4749]: E0310 17:18:24.003517 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30105a8f299f98e4b204912135e1c1a3ef2427ee0f9ee89d9689d2ece0682dc1\": container with ID starting with 30105a8f299f98e4b204912135e1c1a3ef2427ee0f9ee89d9689d2ece0682dc1 not found: ID does not exist" containerID="30105a8f299f98e4b204912135e1c1a3ef2427ee0f9ee89d9689d2ece0682dc1"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.003575 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30105a8f299f98e4b204912135e1c1a3ef2427ee0f9ee89d9689d2ece0682dc1"} err="failed to get container status \"30105a8f299f98e4b204912135e1c1a3ef2427ee0f9ee89d9689d2ece0682dc1\": rpc error: code = NotFound desc = could not find container \"30105a8f299f98e4b204912135e1c1a3ef2427ee0f9ee89d9689d2ece0682dc1\": container with ID starting with 30105a8f299f98e4b204912135e1c1a3ef2427ee0f9ee89d9689d2ece0682dc1 not found: ID does not exist"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.042168 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2nnd\" (UniqueName: \"kubernetes.io/projected/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-kube-api-access-x2nnd\") pod \"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\" (UID: \"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\") "
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.042219 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-config\") pod \"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\" (UID: \"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\") "
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.042322 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-dns-svc\") pod \"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\" (UID: \"8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a\") "
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.047120 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-kube-api-access-x2nnd" (OuterVolumeSpecName: "kube-api-access-x2nnd") pod "8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a" (UID: "8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a"). InnerVolumeSpecName "kube-api-access-x2nnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.083681 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-config" (OuterVolumeSpecName: "config") pod "8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a" (UID: "8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.090265 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a" (UID: "8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.144647 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2nnd\" (UniqueName: \"kubernetes.io/projected/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-kube-api-access-x2nnd\") on node \"crc\" DevicePath \"\""
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.144678 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-config\") on node \"crc\" DevicePath \"\""
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.144687 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.304007 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-cq62w"]
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.311351 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-cq62w"]
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.499814 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 10 17:18:24 crc kubenswrapper[4749]: E0310 17:18:24.500185 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a" containerName="init"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.500200 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a" containerName="init"
Mar 10 17:18:24 crc kubenswrapper[4749]: E0310 17:18:24.500213 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea24c1b-3ed2-4706-a2c8-870c2f7f6840" containerName="init"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.500221 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea24c1b-3ed2-4706-a2c8-870c2f7f6840" containerName="init"
Mar 10 17:18:24 crc kubenswrapper[4749]: E0310 17:18:24.500249 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea24c1b-3ed2-4706-a2c8-870c2f7f6840" containerName="dnsmasq-dns"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.500257 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea24c1b-3ed2-4706-a2c8-870c2f7f6840" containerName="dnsmasq-dns"
Mar 10 17:18:24 crc kubenswrapper[4749]: E0310 17:18:24.500272 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a" containerName="dnsmasq-dns"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.500279 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a" containerName="dnsmasq-dns"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.500451 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a" containerName="dnsmasq-dns"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.500484 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea24c1b-3ed2-4706-a2c8-870c2f7f6840" containerName="dnsmasq-dns"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.501465 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.505471 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.505672 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.505789 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.506186 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-5rfnp"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.567320 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.652333 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpddp\" (UniqueName: \"kubernetes.io/projected/9ba7986f-4ceb-48e9-9813-e6e856113e7c-kube-api-access-fpddp\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.652709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ba7986f-4ceb-48e9-9813-e6e856113e7c-scripts\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.652735 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba7986f-4ceb-48e9-9813-e6e856113e7c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.652835 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9ba7986f-4ceb-48e9-9813-e6e856113e7c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.652867 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ba7986f-4ceb-48e9-9813-e6e856113e7c-config\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.652922 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba7986f-4ceb-48e9-9813-e6e856113e7c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.655614 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba7986f-4ceb-48e9-9813-e6e856113e7c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.757433 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba7986f-4ceb-48e9-9813-e6e856113e7c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.757555 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpddp\" (UniqueName: \"kubernetes.io/projected/9ba7986f-4ceb-48e9-9813-e6e856113e7c-kube-api-access-fpddp\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.757610 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ba7986f-4ceb-48e9-9813-e6e856113e7c-scripts\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.757625 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba7986f-4ceb-48e9-9813-e6e856113e7c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.757642 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9ba7986f-4ceb-48e9-9813-e6e856113e7c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.757662 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ba7986f-4ceb-48e9-9813-e6e856113e7c-config\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.757705 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba7986f-4ceb-48e9-9813-e6e856113e7c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.759740 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ba7986f-4ceb-48e9-9813-e6e856113e7c-config\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.759739 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ba7986f-4ceb-48e9-9813-e6e856113e7c-scripts\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.766818 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba7986f-4ceb-48e9-9813-e6e856113e7c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.767179 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9ba7986f-4ceb-48e9-9813-e6e856113e7c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.768439 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba7986f-4ceb-48e9-9813-e6e856113e7c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.773079 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba7986f-4ceb-48e9-9813-e6e856113e7c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.783312 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpddp\" (UniqueName: \"kubernetes.io/projected/9ba7986f-4ceb-48e9-9813-e6e856113e7c-kube-api-access-fpddp\") pod \"ovn-northd-0\" (UID: \"9ba7986f-4ceb-48e9-9813-e6e856113e7c\") " pod="openstack/ovn-northd-0"
Mar 10 17:18:24 crc kubenswrapper[4749]: I0310 17:18:24.854110 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 10 17:18:25 crc kubenswrapper[4749]: I0310 17:18:25.345761 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 10 17:18:25 crc kubenswrapper[4749]: I0310 17:18:25.616646 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a" path="/var/lib/kubelet/pods/8cb77f2b-7d48-4ed4-9a9a-a6a28a5d071a/volumes"
Mar 10 17:18:25 crc kubenswrapper[4749]: I0310 17:18:25.991155 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9ba7986f-4ceb-48e9-9813-e6e856113e7c","Type":"ContainerStarted","Data":"25e5a6a3246081e38b1c6fb4409ca0eb865b20ed26dde643c9b6b5cc03221504"}
Mar 10 17:18:25 crc kubenswrapper[4749]: I0310 17:18:25.991232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9ba7986f-4ceb-48e9-9813-e6e856113e7c","Type":"ContainerStarted","Data":"87515f333dbbf1cde0ba11740368e02eec5f970408fc00ce877fa6438aa3e3ef"}
Mar 10 17:18:25 crc kubenswrapper[4749]: I0310 17:18:25.991259 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9ba7986f-4ceb-48e9-9813-e6e856113e7c","Type":"ContainerStarted","Data":"268a4819019df5763e23e958781ecc7dca9f365215d3ccf0f31e45b488825900"}
Mar 10 17:18:26 crc kubenswrapper[4749]: I0310 17:18:26.023114 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.02309514 podStartE2EDuration="2.02309514s" podCreationTimestamp="2026-03-10 17:18:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:18:26.021072815 +0000 UTC m=+5403.142938552" watchObservedRunningTime="2026-03-10 17:18:26.02309514 +0000 UTC m=+5403.144960827"
Mar 10 17:18:27 crc kubenswrapper[4749]: I0310 17:18:27.000315 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 10 17:18:27 crc kubenswrapper[4749]: I0310 17:18:27.405660 4749 scope.go:117] "RemoveContainer" containerID="4d408357a8dbec6ca56306ddafe1831fb6ef34697be703f74a01c6a2bca06762"
Mar 10 17:18:27 crc kubenswrapper[4749]: I0310 17:18:27.432724 4749 scope.go:117] "RemoveContainer" containerID="d59309f9557305938549ce3caa72d348360e893c84dcafc3bc235fb065b736bb"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.537112 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-75qt2"]
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.538752 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-75qt2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.549140 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-75qt2"]
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.607429 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8"
Mar 10 17:18:29 crc kubenswrapper[4749]: E0310 17:18:29.607884 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.639824 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rwqc\" (UniqueName: \"kubernetes.io/projected/50ab8a41-5c56-4242-b3e8-939c13843785-kube-api-access-5rwqc\") pod \"keystone-db-create-75qt2\" (UID: \"50ab8a41-5c56-4242-b3e8-939c13843785\") " pod="openstack/keystone-db-create-75qt2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.639992 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50ab8a41-5c56-4242-b3e8-939c13843785-operator-scripts\") pod \"keystone-db-create-75qt2\" (UID: \"50ab8a41-5c56-4242-b3e8-939c13843785\") " pod="openstack/keystone-db-create-75qt2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.644504 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8fd6-account-create-update-pgkg2"]
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.645585 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8fd6-account-create-update-pgkg2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.649346 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.662012 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8fd6-account-create-update-pgkg2"]
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.741391 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rwqc\" (UniqueName: \"kubernetes.io/projected/50ab8a41-5c56-4242-b3e8-939c13843785-kube-api-access-5rwqc\") pod \"keystone-db-create-75qt2\" (UID: \"50ab8a41-5c56-4242-b3e8-939c13843785\") " pod="openstack/keystone-db-create-75qt2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.741637 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70c12163-f467-460b-b482-1af14ef0c774-operator-scripts\") pod \"keystone-8fd6-account-create-update-pgkg2\" (UID: \"70c12163-f467-460b-b482-1af14ef0c774\") " pod="openstack/keystone-8fd6-account-create-update-pgkg2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.741824 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50ab8a41-5c56-4242-b3e8-939c13843785-operator-scripts\") pod \"keystone-db-create-75qt2\" (UID: \"50ab8a41-5c56-4242-b3e8-939c13843785\") " pod="openstack/keystone-db-create-75qt2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.741929 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28xcn\" (UniqueName: \"kubernetes.io/projected/70c12163-f467-460b-b482-1af14ef0c774-kube-api-access-28xcn\") pod \"keystone-8fd6-account-create-update-pgkg2\" (UID: \"70c12163-f467-460b-b482-1af14ef0c774\") " pod="openstack/keystone-8fd6-account-create-update-pgkg2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.744843 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50ab8a41-5c56-4242-b3e8-939c13843785-operator-scripts\") pod \"keystone-db-create-75qt2\" (UID: \"50ab8a41-5c56-4242-b3e8-939c13843785\") " pod="openstack/keystone-db-create-75qt2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.765461 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rwqc\" (UniqueName: \"kubernetes.io/projected/50ab8a41-5c56-4242-b3e8-939c13843785-kube-api-access-5rwqc\") pod \"keystone-db-create-75qt2\" (UID: \"50ab8a41-5c56-4242-b3e8-939c13843785\") " pod="openstack/keystone-db-create-75qt2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.843627 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70c12163-f467-460b-b482-1af14ef0c774-operator-scripts\") pod \"keystone-8fd6-account-create-update-pgkg2\" (UID: \"70c12163-f467-460b-b482-1af14ef0c774\") " pod="openstack/keystone-8fd6-account-create-update-pgkg2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.844041 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28xcn\" (UniqueName: \"kubernetes.io/projected/70c12163-f467-460b-b482-1af14ef0c774-kube-api-access-28xcn\") pod \"keystone-8fd6-account-create-update-pgkg2\" (UID: \"70c12163-f467-460b-b482-1af14ef0c774\") " pod="openstack/keystone-8fd6-account-create-update-pgkg2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.844733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70c12163-f467-460b-b482-1af14ef0c774-operator-scripts\") pod \"keystone-8fd6-account-create-update-pgkg2\" (UID: \"70c12163-f467-460b-b482-1af14ef0c774\") " pod="openstack/keystone-8fd6-account-create-update-pgkg2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.856203 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-75qt2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.868192 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28xcn\" (UniqueName: \"kubernetes.io/projected/70c12163-f467-460b-b482-1af14ef0c774-kube-api-access-28xcn\") pod \"keystone-8fd6-account-create-update-pgkg2\" (UID: \"70c12163-f467-460b-b482-1af14ef0c774\") " pod="openstack/keystone-8fd6-account-create-update-pgkg2"
Mar 10 17:18:29 crc kubenswrapper[4749]: I0310 17:18:29.972585 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8fd6-account-create-update-pgkg2"
Mar 10 17:18:30 crc kubenswrapper[4749]: I0310 17:18:30.337604 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-75qt2"]
Mar 10 17:18:30 crc kubenswrapper[4749]: W0310 17:18:30.343539 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50ab8a41_5c56_4242_b3e8_939c13843785.slice/crio-a09a3b4a76b8d86628a30a8c1668252105412639ddb636b3d76dbbf012c1ea09 WatchSource:0}: Error finding container a09a3b4a76b8d86628a30a8c1668252105412639ddb636b3d76dbbf012c1ea09: Status 404 returned error can't find the container with id a09a3b4a76b8d86628a30a8c1668252105412639ddb636b3d76dbbf012c1ea09
Mar 10 17:18:30 crc kubenswrapper[4749]: I0310 17:18:30.448855 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8fd6-account-create-update-pgkg2"]
Mar 10 17:18:30 crc kubenswrapper[4749]: W0310 17:18:30.452204 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70c12163_f467_460b_b482_1af14ef0c774.slice/crio-5dcbf6b829d313f3ba1ed5a29ea24927d07b337fb230ffb5da145f4df3aa53f5 WatchSource:0}: Error finding container 5dcbf6b829d313f3ba1ed5a29ea24927d07b337fb230ffb5da145f4df3aa53f5: Status 404 returned error can't find the container with id 5dcbf6b829d313f3ba1ed5a29ea24927d07b337fb230ffb5da145f4df3aa53f5
Mar 10 17:18:31 crc kubenswrapper[4749]: I0310 17:18:31.033507 4749 generic.go:334] "Generic (PLEG): container finished" podID="70c12163-f467-460b-b482-1af14ef0c774" containerID="80c110d995cfc5810c456e45f1acb114cb472ef51e1e84bdc7a71705eb667ff9" exitCode=0
Mar 10 17:18:31 crc kubenswrapper[4749]: I0310 17:18:31.033588 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8fd6-account-create-update-pgkg2" event={"ID":"70c12163-f467-460b-b482-1af14ef0c774","Type":"ContainerDied","Data":"80c110d995cfc5810c456e45f1acb114cb472ef51e1e84bdc7a71705eb667ff9"}
Mar 10 17:18:31 crc kubenswrapper[4749]: I0310 17:18:31.033624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8fd6-account-create-update-pgkg2" event={"ID":"70c12163-f467-460b-b482-1af14ef0c774","Type":"ContainerStarted","Data":"5dcbf6b829d313f3ba1ed5a29ea24927d07b337fb230ffb5da145f4df3aa53f5"}
Mar 10 17:18:31 crc kubenswrapper[4749]: I0310 17:18:31.036500 4749 generic.go:334] "Generic (PLEG): container finished" podID="50ab8a41-5c56-4242-b3e8-939c13843785" containerID="2def575f5f1e5eb4ed9cba46f7132d042b48cef146c8cb817a22b82039c92e4a" exitCode=0
Mar 10 17:18:31 crc kubenswrapper[4749]: I0310 17:18:31.036552 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-75qt2" event={"ID":"50ab8a41-5c56-4242-b3e8-939c13843785","Type":"ContainerDied","Data":"2def575f5f1e5eb4ed9cba46f7132d042b48cef146c8cb817a22b82039c92e4a"}
Mar 10 17:18:31 crc kubenswrapper[4749]: I0310 17:18:31.036610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/keystone-db-create-75qt2" event={"ID":"50ab8a41-5c56-4242-b3e8-939c13843785","Type":"ContainerStarted","Data":"a09a3b4a76b8d86628a30a8c1668252105412639ddb636b3d76dbbf012c1ea09"} Mar 10 17:18:32 crc kubenswrapper[4749]: I0310 17:18:32.482941 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-75qt2" Mar 10 17:18:32 crc kubenswrapper[4749]: I0310 17:18:32.508135 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8fd6-account-create-update-pgkg2" Mar 10 17:18:32 crc kubenswrapper[4749]: I0310 17:18:32.595270 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28xcn\" (UniqueName: \"kubernetes.io/projected/70c12163-f467-460b-b482-1af14ef0c774-kube-api-access-28xcn\") pod \"70c12163-f467-460b-b482-1af14ef0c774\" (UID: \"70c12163-f467-460b-b482-1af14ef0c774\") " Mar 10 17:18:32 crc kubenswrapper[4749]: I0310 17:18:32.595456 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rwqc\" (UniqueName: \"kubernetes.io/projected/50ab8a41-5c56-4242-b3e8-939c13843785-kube-api-access-5rwqc\") pod \"50ab8a41-5c56-4242-b3e8-939c13843785\" (UID: \"50ab8a41-5c56-4242-b3e8-939c13843785\") " Mar 10 17:18:32 crc kubenswrapper[4749]: I0310 17:18:32.595513 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50ab8a41-5c56-4242-b3e8-939c13843785-operator-scripts\") pod \"50ab8a41-5c56-4242-b3e8-939c13843785\" (UID: \"50ab8a41-5c56-4242-b3e8-939c13843785\") " Mar 10 17:18:32 crc kubenswrapper[4749]: I0310 17:18:32.595581 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70c12163-f467-460b-b482-1af14ef0c774-operator-scripts\") pod \"70c12163-f467-460b-b482-1af14ef0c774\" (UID: 
\"70c12163-f467-460b-b482-1af14ef0c774\") " Mar 10 17:18:32 crc kubenswrapper[4749]: I0310 17:18:32.596243 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50ab8a41-5c56-4242-b3e8-939c13843785-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50ab8a41-5c56-4242-b3e8-939c13843785" (UID: "50ab8a41-5c56-4242-b3e8-939c13843785"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:18:32 crc kubenswrapper[4749]: I0310 17:18:32.596351 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70c12163-f467-460b-b482-1af14ef0c774-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "70c12163-f467-460b-b482-1af14ef0c774" (UID: "70c12163-f467-460b-b482-1af14ef0c774"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:18:32 crc kubenswrapper[4749]: I0310 17:18:32.599990 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ab8a41-5c56-4242-b3e8-939c13843785-kube-api-access-5rwqc" (OuterVolumeSpecName: "kube-api-access-5rwqc") pod "50ab8a41-5c56-4242-b3e8-939c13843785" (UID: "50ab8a41-5c56-4242-b3e8-939c13843785"). InnerVolumeSpecName "kube-api-access-5rwqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:18:32 crc kubenswrapper[4749]: I0310 17:18:32.600042 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c12163-f467-460b-b482-1af14ef0c774-kube-api-access-28xcn" (OuterVolumeSpecName: "kube-api-access-28xcn") pod "70c12163-f467-460b-b482-1af14ef0c774" (UID: "70c12163-f467-460b-b482-1af14ef0c774"). InnerVolumeSpecName "kube-api-access-28xcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:18:32 crc kubenswrapper[4749]: I0310 17:18:32.697872 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28xcn\" (UniqueName: \"kubernetes.io/projected/70c12163-f467-460b-b482-1af14ef0c774-kube-api-access-28xcn\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:32 crc kubenswrapper[4749]: I0310 17:18:32.698110 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rwqc\" (UniqueName: \"kubernetes.io/projected/50ab8a41-5c56-4242-b3e8-939c13843785-kube-api-access-5rwqc\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:32 crc kubenswrapper[4749]: I0310 17:18:32.698186 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50ab8a41-5c56-4242-b3e8-939c13843785-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:32 crc kubenswrapper[4749]: I0310 17:18:32.698256 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70c12163-f467-460b-b482-1af14ef0c774-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:33 crc kubenswrapper[4749]: I0310 17:18:33.054475 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-75qt2" Mar 10 17:18:33 crc kubenswrapper[4749]: I0310 17:18:33.054458 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-75qt2" event={"ID":"50ab8a41-5c56-4242-b3e8-939c13843785","Type":"ContainerDied","Data":"a09a3b4a76b8d86628a30a8c1668252105412639ddb636b3d76dbbf012c1ea09"} Mar 10 17:18:33 crc kubenswrapper[4749]: I0310 17:18:33.054600 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a09a3b4a76b8d86628a30a8c1668252105412639ddb636b3d76dbbf012c1ea09" Mar 10 17:18:33 crc kubenswrapper[4749]: I0310 17:18:33.055941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8fd6-account-create-update-pgkg2" event={"ID":"70c12163-f467-460b-b482-1af14ef0c774","Type":"ContainerDied","Data":"5dcbf6b829d313f3ba1ed5a29ea24927d07b337fb230ffb5da145f4df3aa53f5"} Mar 10 17:18:33 crc kubenswrapper[4749]: I0310 17:18:33.055992 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dcbf6b829d313f3ba1ed5a29ea24927d07b337fb230ffb5da145f4df3aa53f5" Mar 10 17:18:33 crc kubenswrapper[4749]: I0310 17:18:33.056029 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8fd6-account-create-update-pgkg2" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.092109 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-cdn5f"] Mar 10 17:18:35 crc kubenswrapper[4749]: E0310 17:18:35.094579 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c12163-f467-460b-b482-1af14ef0c774" containerName="mariadb-account-create-update" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.094620 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c12163-f467-460b-b482-1af14ef0c774" containerName="mariadb-account-create-update" Mar 10 17:18:35 crc kubenswrapper[4749]: E0310 17:18:35.094641 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ab8a41-5c56-4242-b3e8-939c13843785" containerName="mariadb-database-create" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.094650 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ab8a41-5c56-4242-b3e8-939c13843785" containerName="mariadb-database-create" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.094903 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c12163-f467-460b-b482-1af14ef0c774" containerName="mariadb-account-create-update" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.094930 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ab8a41-5c56-4242-b3e8-939c13843785" containerName="mariadb-database-create" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.095616 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cdn5f" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.100514 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.100678 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.100753 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.100796 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-msrwg" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.104853 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cdn5f"] Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.254876 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-config-data\") pod \"keystone-db-sync-cdn5f\" (UID: \"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\") " pod="openstack/keystone-db-sync-cdn5f" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.254977 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlnq9\" (UniqueName: \"kubernetes.io/projected/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-kube-api-access-mlnq9\") pod \"keystone-db-sync-cdn5f\" (UID: \"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\") " pod="openstack/keystone-db-sync-cdn5f" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.255084 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-combined-ca-bundle\") pod \"keystone-db-sync-cdn5f\" (UID: 
\"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\") " pod="openstack/keystone-db-sync-cdn5f" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.356498 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlnq9\" (UniqueName: \"kubernetes.io/projected/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-kube-api-access-mlnq9\") pod \"keystone-db-sync-cdn5f\" (UID: \"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\") " pod="openstack/keystone-db-sync-cdn5f" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.356701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-combined-ca-bundle\") pod \"keystone-db-sync-cdn5f\" (UID: \"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\") " pod="openstack/keystone-db-sync-cdn5f" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.356781 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-config-data\") pod \"keystone-db-sync-cdn5f\" (UID: \"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\") " pod="openstack/keystone-db-sync-cdn5f" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.362353 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-config-data\") pod \"keystone-db-sync-cdn5f\" (UID: \"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\") " pod="openstack/keystone-db-sync-cdn5f" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.362841 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-combined-ca-bundle\") pod \"keystone-db-sync-cdn5f\" (UID: \"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\") " pod="openstack/keystone-db-sync-cdn5f" Mar 10 17:18:35 crc kubenswrapper[4749]: 
I0310 17:18:35.373787 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlnq9\" (UniqueName: \"kubernetes.io/projected/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-kube-api-access-mlnq9\") pod \"keystone-db-sync-cdn5f\" (UID: \"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\") " pod="openstack/keystone-db-sync-cdn5f" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.424083 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cdn5f" Mar 10 17:18:35 crc kubenswrapper[4749]: I0310 17:18:35.847248 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cdn5f"] Mar 10 17:18:35 crc kubenswrapper[4749]: W0310 17:18:35.854789 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8658b95_e5ac_4c1f_a48c_1ec864a9df1d.slice/crio-a990c0ca1df169dbb328211bd5eadb66497b0610bed7ea7841b9ea1734a09f8e WatchSource:0}: Error finding container a990c0ca1df169dbb328211bd5eadb66497b0610bed7ea7841b9ea1734a09f8e: Status 404 returned error can't find the container with id a990c0ca1df169dbb328211bd5eadb66497b0610bed7ea7841b9ea1734a09f8e Mar 10 17:18:36 crc kubenswrapper[4749]: I0310 17:18:36.090515 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cdn5f" event={"ID":"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d","Type":"ContainerStarted","Data":"c850ebc80ad511fe6a829e00eb26120d0b1d70681267915b1f77ce8468fafc5d"} Mar 10 17:18:36 crc kubenswrapper[4749]: I0310 17:18:36.090811 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cdn5f" event={"ID":"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d","Type":"ContainerStarted","Data":"a990c0ca1df169dbb328211bd5eadb66497b0610bed7ea7841b9ea1734a09f8e"} Mar 10 17:18:36 crc kubenswrapper[4749]: I0310 17:18:36.111437 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-db-sync-cdn5f" podStartSLOduration=1.111412438 podStartE2EDuration="1.111412438s" podCreationTimestamp="2026-03-10 17:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:18:36.106232197 +0000 UTC m=+5413.228097884" watchObservedRunningTime="2026-03-10 17:18:36.111412438 +0000 UTC m=+5413.233278135" Mar 10 17:18:38 crc kubenswrapper[4749]: I0310 17:18:38.120871 4749 generic.go:334] "Generic (PLEG): container finished" podID="f8658b95-e5ac-4c1f-a48c-1ec864a9df1d" containerID="c850ebc80ad511fe6a829e00eb26120d0b1d70681267915b1f77ce8468fafc5d" exitCode=0 Mar 10 17:18:38 crc kubenswrapper[4749]: I0310 17:18:38.121456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cdn5f" event={"ID":"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d","Type":"ContainerDied","Data":"c850ebc80ad511fe6a829e00eb26120d0b1d70681267915b1f77ce8468fafc5d"} Mar 10 17:18:39 crc kubenswrapper[4749]: I0310 17:18:39.520217 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cdn5f" Mar 10 17:18:39 crc kubenswrapper[4749]: I0310 17:18:39.623251 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-config-data\") pod \"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\" (UID: \"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\") " Mar 10 17:18:39 crc kubenswrapper[4749]: I0310 17:18:39.623389 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-combined-ca-bundle\") pod \"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\" (UID: \"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\") " Mar 10 17:18:39 crc kubenswrapper[4749]: I0310 17:18:39.623442 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlnq9\" (UniqueName: \"kubernetes.io/projected/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-kube-api-access-mlnq9\") pod \"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\" (UID: \"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d\") " Mar 10 17:18:39 crc kubenswrapper[4749]: I0310 17:18:39.630004 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-kube-api-access-mlnq9" (OuterVolumeSpecName: "kube-api-access-mlnq9") pod "f8658b95-e5ac-4c1f-a48c-1ec864a9df1d" (UID: "f8658b95-e5ac-4c1f-a48c-1ec864a9df1d"). InnerVolumeSpecName "kube-api-access-mlnq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:18:39 crc kubenswrapper[4749]: I0310 17:18:39.657699 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8658b95-e5ac-4c1f-a48c-1ec864a9df1d" (UID: "f8658b95-e5ac-4c1f-a48c-1ec864a9df1d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:18:39 crc kubenswrapper[4749]: I0310 17:18:39.671612 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-config-data" (OuterVolumeSpecName: "config-data") pod "f8658b95-e5ac-4c1f-a48c-1ec864a9df1d" (UID: "f8658b95-e5ac-4c1f-a48c-1ec864a9df1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:18:39 crc kubenswrapper[4749]: I0310 17:18:39.725968 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlnq9\" (UniqueName: \"kubernetes.io/projected/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-kube-api-access-mlnq9\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:39 crc kubenswrapper[4749]: I0310 17:18:39.726009 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:39 crc kubenswrapper[4749]: I0310 17:18:39.726024 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.138130 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cdn5f" event={"ID":"f8658b95-e5ac-4c1f-a48c-1ec864a9df1d","Type":"ContainerDied","Data":"a990c0ca1df169dbb328211bd5eadb66497b0610bed7ea7841b9ea1734a09f8e"} Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.138418 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a990c0ca1df169dbb328211bd5eadb66497b0610bed7ea7841b9ea1734a09f8e" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.138192 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cdn5f" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.475727 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-g59ht"] Mar 10 17:18:40 crc kubenswrapper[4749]: E0310 17:18:40.476320 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8658b95-e5ac-4c1f-a48c-1ec864a9df1d" containerName="keystone-db-sync" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.476445 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8658b95-e5ac-4c1f-a48c-1ec864a9df1d" containerName="keystone-db-sync" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.476714 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8658b95-e5ac-4c1f-a48c-1ec864a9df1d" containerName="keystone-db-sync" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.477303 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.482594 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.482594 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.482778 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.482953 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-msrwg" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.483068 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.506600 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-g59ht"] Mar 10 17:18:40 crc 
kubenswrapper[4749]: I0310 17:18:40.538329 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66d5956757-8smgt"] Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.539751 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.556472 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5956757-8smgt"] Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.607183 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:18:40 crc kubenswrapper[4749]: E0310 17:18:40.607421 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.642611 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-combined-ca-bundle\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.642805 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c80035b9-31e6-4623-9c53-3519f48b937c-ovsdbserver-sb\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc 
kubenswrapper[4749]: I0310 17:18:40.642842 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-credential-keys\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.642916 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svgzs\" (UniqueName: \"kubernetes.io/projected/c80035b9-31e6-4623-9c53-3519f48b937c-kube-api-access-svgzs\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.642965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c80035b9-31e6-4623-9c53-3519f48b937c-config\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.643037 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7flh4\" (UniqueName: \"kubernetes.io/projected/b19c31c1-43bb-4296-95db-d1717831b240-kube-api-access-7flh4\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.643062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c80035b9-31e6-4623-9c53-3519f48b937c-dns-svc\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " 
pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.643112 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-fernet-keys\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.643337 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-scripts\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.643426 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-config-data\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.643525 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c80035b9-31e6-4623-9c53-3519f48b937c-ovsdbserver-nb\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.744742 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c80035b9-31e6-4623-9c53-3519f48b937c-ovsdbserver-sb\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " 
pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.744791 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-credential-keys\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.744842 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svgzs\" (UniqueName: \"kubernetes.io/projected/c80035b9-31e6-4623-9c53-3519f48b937c-kube-api-access-svgzs\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.744861 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c80035b9-31e6-4623-9c53-3519f48b937c-config\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.744942 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7flh4\" (UniqueName: \"kubernetes.io/projected/b19c31c1-43bb-4296-95db-d1717831b240-kube-api-access-7flh4\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.744962 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c80035b9-31e6-4623-9c53-3519f48b937c-dns-svc\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc 
kubenswrapper[4749]: I0310 17:18:40.745016 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-fernet-keys\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.745058 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-scripts\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.745082 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-config-data\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.745139 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c80035b9-31e6-4623-9c53-3519f48b937c-ovsdbserver-nb\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.745206 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-combined-ca-bundle\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.745549 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c80035b9-31e6-4623-9c53-3519f48b937c-ovsdbserver-sb\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.746532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c80035b9-31e6-4623-9c53-3519f48b937c-config\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.746710 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c80035b9-31e6-4623-9c53-3519f48b937c-ovsdbserver-nb\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.747020 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c80035b9-31e6-4623-9c53-3519f48b937c-dns-svc\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.750322 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-scripts\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.750571 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-credential-keys\") pod \"keystone-bootstrap-g59ht\" 
(UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.754008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-fernet-keys\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.767627 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-config-data\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.771980 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svgzs\" (UniqueName: \"kubernetes.io/projected/c80035b9-31e6-4623-9c53-3519f48b937c-kube-api-access-svgzs\") pod \"dnsmasq-dns-66d5956757-8smgt\" (UID: \"c80035b9-31e6-4623-9c53-3519f48b937c\") " pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.774071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-combined-ca-bundle\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.775951 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7flh4\" (UniqueName: \"kubernetes.io/projected/b19c31c1-43bb-4296-95db-d1717831b240-kube-api-access-7flh4\") pod \"keystone-bootstrap-g59ht\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc 
kubenswrapper[4749]: I0310 17:18:40.794629 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:40 crc kubenswrapper[4749]: I0310 17:18:40.863505 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:41 crc kubenswrapper[4749]: I0310 17:18:41.242974 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-g59ht"] Mar 10 17:18:41 crc kubenswrapper[4749]: W0310 17:18:41.250096 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb19c31c1_43bb_4296_95db_d1717831b240.slice/crio-cc3ee829b7dae151356d911a33c4c95b4325d4beffe244362a9fb18383858dcc WatchSource:0}: Error finding container cc3ee829b7dae151356d911a33c4c95b4325d4beffe244362a9fb18383858dcc: Status 404 returned error can't find the container with id cc3ee829b7dae151356d911a33c4c95b4325d4beffe244362a9fb18383858dcc Mar 10 17:18:41 crc kubenswrapper[4749]: I0310 17:18:41.359246 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5956757-8smgt"] Mar 10 17:18:41 crc kubenswrapper[4749]: W0310 17:18:41.360936 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc80035b9_31e6_4623_9c53_3519f48b937c.slice/crio-0356efe5e59245fa6cdf9c7ba17a80344a58f74573790b2f85512e5801d159e5 WatchSource:0}: Error finding container 0356efe5e59245fa6cdf9c7ba17a80344a58f74573790b2f85512e5801d159e5: Status 404 returned error can't find the container with id 0356efe5e59245fa6cdf9c7ba17a80344a58f74573790b2f85512e5801d159e5 Mar 10 17:18:42 crc kubenswrapper[4749]: I0310 17:18:42.154690 4749 generic.go:334] "Generic (PLEG): container finished" podID="c80035b9-31e6-4623-9c53-3519f48b937c" containerID="0924b7db3e8fe0d8ff5f47b5d5da44de858a4f3a4c3a733a6761126d6c7c8c66" 
exitCode=0 Mar 10 17:18:42 crc kubenswrapper[4749]: I0310 17:18:42.154731 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5956757-8smgt" event={"ID":"c80035b9-31e6-4623-9c53-3519f48b937c","Type":"ContainerDied","Data":"0924b7db3e8fe0d8ff5f47b5d5da44de858a4f3a4c3a733a6761126d6c7c8c66"} Mar 10 17:18:42 crc kubenswrapper[4749]: I0310 17:18:42.155041 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5956757-8smgt" event={"ID":"c80035b9-31e6-4623-9c53-3519f48b937c","Type":"ContainerStarted","Data":"0356efe5e59245fa6cdf9c7ba17a80344a58f74573790b2f85512e5801d159e5"} Mar 10 17:18:42 crc kubenswrapper[4749]: I0310 17:18:42.161089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g59ht" event={"ID":"b19c31c1-43bb-4296-95db-d1717831b240","Type":"ContainerStarted","Data":"0098d06c430536f716c2cdfcc19d3454b02b2b9cd3054fdd369bd6c9085c27ea"} Mar 10 17:18:42 crc kubenswrapper[4749]: I0310 17:18:42.161149 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g59ht" event={"ID":"b19c31c1-43bb-4296-95db-d1717831b240","Type":"ContainerStarted","Data":"cc3ee829b7dae151356d911a33c4c95b4325d4beffe244362a9fb18383858dcc"} Mar 10 17:18:42 crc kubenswrapper[4749]: I0310 17:18:42.212052 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-g59ht" podStartSLOduration=2.212033016 podStartE2EDuration="2.212033016s" podCreationTimestamp="2026-03-10 17:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:18:42.208974933 +0000 UTC m=+5419.330840630" watchObservedRunningTime="2026-03-10 17:18:42.212033016 +0000 UTC m=+5419.333898703" Mar 10 17:18:43 crc kubenswrapper[4749]: I0310 17:18:43.169697 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5956757-8smgt" 
event={"ID":"c80035b9-31e6-4623-9c53-3519f48b937c","Type":"ContainerStarted","Data":"e66a36cb809d90b486b2563a4697b132398e28ee2c1c47f56b050c18072c185a"} Mar 10 17:18:43 crc kubenswrapper[4749]: I0310 17:18:43.192109 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66d5956757-8smgt" podStartSLOduration=3.192094235 podStartE2EDuration="3.192094235s" podCreationTimestamp="2026-03-10 17:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:18:43.190446611 +0000 UTC m=+5420.312312318" watchObservedRunningTime="2026-03-10 17:18:43.192094235 +0000 UTC m=+5420.313959922" Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.175856 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.214673 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wpfsj"] Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.216235 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.227903 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpfsj"] Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.300177 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-utilities\") pod \"redhat-marketplace-wpfsj\" (UID: \"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\") " pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.300268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-catalog-content\") pod \"redhat-marketplace-wpfsj\" (UID: \"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\") " pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.300480 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf7rw\" (UniqueName: \"kubernetes.io/projected/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-kube-api-access-qf7rw\") pod \"redhat-marketplace-wpfsj\" (UID: \"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\") " pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.402537 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-utilities\") pod \"redhat-marketplace-wpfsj\" (UID: \"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\") " pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.402988 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-utilities\") pod \"redhat-marketplace-wpfsj\" (UID: \"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\") " pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.403119 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-catalog-content\") pod \"redhat-marketplace-wpfsj\" (UID: \"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\") " pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.403394 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-catalog-content\") pod \"redhat-marketplace-wpfsj\" (UID: \"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\") " pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.403459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf7rw\" (UniqueName: \"kubernetes.io/projected/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-kube-api-access-qf7rw\") pod \"redhat-marketplace-wpfsj\" (UID: \"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\") " pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.424970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf7rw\" (UniqueName: \"kubernetes.io/projected/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-kube-api-access-qf7rw\") pod \"redhat-marketplace-wpfsj\" (UID: \"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\") " pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.535072 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:44 crc kubenswrapper[4749]: I0310 17:18:44.914286 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 10 17:18:45 crc kubenswrapper[4749]: I0310 17:18:45.006534 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpfsj"] Mar 10 17:18:45 crc kubenswrapper[4749]: I0310 17:18:45.183832 4749 generic.go:334] "Generic (PLEG): container finished" podID="b19c31c1-43bb-4296-95db-d1717831b240" containerID="0098d06c430536f716c2cdfcc19d3454b02b2b9cd3054fdd369bd6c9085c27ea" exitCode=0 Mar 10 17:18:45 crc kubenswrapper[4749]: I0310 17:18:45.183892 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g59ht" event={"ID":"b19c31c1-43bb-4296-95db-d1717831b240","Type":"ContainerDied","Data":"0098d06c430536f716c2cdfcc19d3454b02b2b9cd3054fdd369bd6c9085c27ea"} Mar 10 17:18:45 crc kubenswrapper[4749]: I0310 17:18:45.186330 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpfsj" event={"ID":"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561","Type":"ContainerStarted","Data":"53bac7efb0b552ea281975c1b652687292e0e5440034a2ba5669edf53f7196a7"} Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.196450 4749 generic.go:334] "Generic (PLEG): container finished" podID="b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" containerID="df155de2d5ff5e821f5d3314dee9ab93e037ef29c0b71df934837834c04214a5" exitCode=0 Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.196525 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpfsj" event={"ID":"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561","Type":"ContainerDied","Data":"df155de2d5ff5e821f5d3314dee9ab93e037ef29c0b71df934837834c04214a5"} Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.560211 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.657877 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-credential-keys\") pod \"b19c31c1-43bb-4296-95db-d1717831b240\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.657968 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-config-data\") pod \"b19c31c1-43bb-4296-95db-d1717831b240\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.658032 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-scripts\") pod \"b19c31c1-43bb-4296-95db-d1717831b240\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.658148 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7flh4\" (UniqueName: \"kubernetes.io/projected/b19c31c1-43bb-4296-95db-d1717831b240-kube-api-access-7flh4\") pod \"b19c31c1-43bb-4296-95db-d1717831b240\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.658187 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-fernet-keys\") pod \"b19c31c1-43bb-4296-95db-d1717831b240\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.659130 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-combined-ca-bundle\") pod \"b19c31c1-43bb-4296-95db-d1717831b240\" (UID: \"b19c31c1-43bb-4296-95db-d1717831b240\") " Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.665366 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-scripts" (OuterVolumeSpecName: "scripts") pod "b19c31c1-43bb-4296-95db-d1717831b240" (UID: "b19c31c1-43bb-4296-95db-d1717831b240"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.667298 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b19c31c1-43bb-4296-95db-d1717831b240" (UID: "b19c31c1-43bb-4296-95db-d1717831b240"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.668014 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b19c31c1-43bb-4296-95db-d1717831b240" (UID: "b19c31c1-43bb-4296-95db-d1717831b240"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.684138 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b19c31c1-43bb-4296-95db-d1717831b240-kube-api-access-7flh4" (OuterVolumeSpecName: "kube-api-access-7flh4") pod "b19c31c1-43bb-4296-95db-d1717831b240" (UID: "b19c31c1-43bb-4296-95db-d1717831b240"). InnerVolumeSpecName "kube-api-access-7flh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.707930 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b19c31c1-43bb-4296-95db-d1717831b240" (UID: "b19c31c1-43bb-4296-95db-d1717831b240"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.722045 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-config-data" (OuterVolumeSpecName: "config-data") pod "b19c31c1-43bb-4296-95db-d1717831b240" (UID: "b19c31c1-43bb-4296-95db-d1717831b240"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.761712 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.761749 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.761760 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.761771 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7flh4\" (UniqueName: \"kubernetes.io/projected/b19c31c1-43bb-4296-95db-d1717831b240-kube-api-access-7flh4\") on node \"crc\" DevicePath \"\"" Mar 10 
17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.761784 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:46 crc kubenswrapper[4749]: I0310 17:18:46.761795 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19c31c1-43bb-4296-95db-d1717831b240-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.207708 4749 generic.go:334] "Generic (PLEG): container finished" podID="b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" containerID="a2b96fca0a84ccb5a67da4e58bc94ee5bf7e63a65eb40666f52ef4823e316420" exitCode=0 Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.207827 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpfsj" event={"ID":"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561","Type":"ContainerDied","Data":"a2b96fca0a84ccb5a67da4e58bc94ee5bf7e63a65eb40666f52ef4823e316420"} Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.211961 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g59ht" event={"ID":"b19c31c1-43bb-4296-95db-d1717831b240","Type":"ContainerDied","Data":"cc3ee829b7dae151356d911a33c4c95b4325d4beffe244362a9fb18383858dcc"} Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.211990 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc3ee829b7dae151356d911a33c4c95b4325d4beffe244362a9fb18383858dcc" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.212044 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-g59ht" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.284728 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-g59ht"] Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.292908 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-g59ht"] Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.365367 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jfqpd"] Mar 10 17:18:47 crc kubenswrapper[4749]: E0310 17:18:47.365775 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19c31c1-43bb-4296-95db-d1717831b240" containerName="keystone-bootstrap" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.365791 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19c31c1-43bb-4296-95db-d1717831b240" containerName="keystone-bootstrap" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.365953 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b19c31c1-43bb-4296-95db-d1717831b240" containerName="keystone-bootstrap" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.366494 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.372930 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-msrwg" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.373006 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.373951 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.378898 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.382798 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.391839 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jfqpd"] Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.476810 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-config-data\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.476858 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-combined-ca-bundle\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.476897 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-fernet-keys\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.476921 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-credential-keys\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.476956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z55gv\" (UniqueName: \"kubernetes.io/projected/249a5269-fc54-4bee-98a8-de5b4bad612b-kube-api-access-z55gv\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.476977 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-scripts\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.578731 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-config-data\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.578789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-combined-ca-bundle\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.578825 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-fernet-keys\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.578844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-credential-keys\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.578883 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z55gv\" (UniqueName: \"kubernetes.io/projected/249a5269-fc54-4bee-98a8-de5b4bad612b-kube-api-access-z55gv\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.578910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-scripts\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.584252 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-credential-keys\") pod \"keystone-bootstrap-jfqpd\" 
(UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.584597 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-scripts\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.585151 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-combined-ca-bundle\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.585947 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-fernet-keys\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.587067 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-config-data\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.595524 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z55gv\" (UniqueName: \"kubernetes.io/projected/249a5269-fc54-4bee-98a8-de5b4bad612b-kube-api-access-z55gv\") pod \"keystone-bootstrap-jfqpd\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 
17:18:47.623565 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b19c31c1-43bb-4296-95db-d1717831b240" path="/var/lib/kubelet/pods/b19c31c1-43bb-4296-95db-d1717831b240/volumes" Mar 10 17:18:47 crc kubenswrapper[4749]: I0310 17:18:47.687456 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:48 crc kubenswrapper[4749]: I0310 17:18:48.144671 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jfqpd"] Mar 10 17:18:48 crc kubenswrapper[4749]: I0310 17:18:48.221615 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jfqpd" event={"ID":"249a5269-fc54-4bee-98a8-de5b4bad612b","Type":"ContainerStarted","Data":"5f186ea32b81f1c077340d77ad1c364e4a97c83b1b5a258678059bcabe1f2108"} Mar 10 17:18:48 crc kubenswrapper[4749]: I0310 17:18:48.224092 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpfsj" event={"ID":"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561","Type":"ContainerStarted","Data":"fd777b73771b6adb1f435aac1acb45543ebf9a78efeb392ae5684555814e39dd"} Mar 10 17:18:48 crc kubenswrapper[4749]: I0310 17:18:48.251271 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wpfsj" podStartSLOduration=2.812499335 podStartE2EDuration="4.251250457s" podCreationTimestamp="2026-03-10 17:18:44 +0000 UTC" firstStartedPulling="2026-03-10 17:18:46.197715521 +0000 UTC m=+5423.319581208" lastFinishedPulling="2026-03-10 17:18:47.636466643 +0000 UTC m=+5424.758332330" observedRunningTime="2026-03-10 17:18:48.249258733 +0000 UTC m=+5425.371124430" watchObservedRunningTime="2026-03-10 17:18:48.251250457 +0000 UTC m=+5425.373116154" Mar 10 17:18:49 crc kubenswrapper[4749]: I0310 17:18:49.236177 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jfqpd" 
event={"ID":"249a5269-fc54-4bee-98a8-de5b4bad612b","Type":"ContainerStarted","Data":"763b57c76dbfb7c1c2e1347871a97f851eeaf031773277649124dbc8438ca660"} Mar 10 17:18:49 crc kubenswrapper[4749]: I0310 17:18:49.252826 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jfqpd" podStartSLOduration=2.2528074240000002 podStartE2EDuration="2.252807424s" podCreationTimestamp="2026-03-10 17:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:18:49.249839103 +0000 UTC m=+5426.371704790" watchObservedRunningTime="2026-03-10 17:18:49.252807424 +0000 UTC m=+5426.374673101" Mar 10 17:18:50 crc kubenswrapper[4749]: I0310 17:18:50.864604 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66d5956757-8smgt" Mar 10 17:18:50 crc kubenswrapper[4749]: I0310 17:18:50.953580 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68577db887-pd8q5"] Mar 10 17:18:50 crc kubenswrapper[4749]: I0310 17:18:50.954070 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68577db887-pd8q5" podUID="50da7d2a-6f11-4f50-b065-10dfc13affc2" containerName="dnsmasq-dns" containerID="cri-o://b2ec6e620b2c600ea2c20fb4f7dbef9bf8ce5fa12b10b8e4b7ab80755972b9ef" gracePeriod=10 Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.261946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jfqpd" event={"ID":"249a5269-fc54-4bee-98a8-de5b4bad612b","Type":"ContainerDied","Data":"763b57c76dbfb7c1c2e1347871a97f851eeaf031773277649124dbc8438ca660"} Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.261952 4749 generic.go:334] "Generic (PLEG): container finished" podID="249a5269-fc54-4bee-98a8-de5b4bad612b" containerID="763b57c76dbfb7c1c2e1347871a97f851eeaf031773277649124dbc8438ca660" exitCode=0 Mar 
10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.272128 4749 generic.go:334] "Generic (PLEG): container finished" podID="50da7d2a-6f11-4f50-b065-10dfc13affc2" containerID="b2ec6e620b2c600ea2c20fb4f7dbef9bf8ce5fa12b10b8e4b7ab80755972b9ef" exitCode=0 Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.272174 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68577db887-pd8q5" event={"ID":"50da7d2a-6f11-4f50-b065-10dfc13affc2","Type":"ContainerDied","Data":"b2ec6e620b2c600ea2c20fb4f7dbef9bf8ce5fa12b10b8e4b7ab80755972b9ef"} Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.470530 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.562666 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-ovsdbserver-sb\") pod \"50da7d2a-6f11-4f50-b065-10dfc13affc2\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.562739 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7hgb\" (UniqueName: \"kubernetes.io/projected/50da7d2a-6f11-4f50-b065-10dfc13affc2-kube-api-access-k7hgb\") pod \"50da7d2a-6f11-4f50-b065-10dfc13affc2\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.562803 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-config\") pod \"50da7d2a-6f11-4f50-b065-10dfc13affc2\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.562829 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-ovsdbserver-nb\") pod \"50da7d2a-6f11-4f50-b065-10dfc13affc2\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.562854 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-dns-svc\") pod \"50da7d2a-6f11-4f50-b065-10dfc13affc2\" (UID: \"50da7d2a-6f11-4f50-b065-10dfc13affc2\") " Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.568601 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50da7d2a-6f11-4f50-b065-10dfc13affc2-kube-api-access-k7hgb" (OuterVolumeSpecName: "kube-api-access-k7hgb") pod "50da7d2a-6f11-4f50-b065-10dfc13affc2" (UID: "50da7d2a-6f11-4f50-b065-10dfc13affc2"). InnerVolumeSpecName "kube-api-access-k7hgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.610958 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-config" (OuterVolumeSpecName: "config") pod "50da7d2a-6f11-4f50-b065-10dfc13affc2" (UID: "50da7d2a-6f11-4f50-b065-10dfc13affc2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.613592 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "50da7d2a-6f11-4f50-b065-10dfc13affc2" (UID: "50da7d2a-6f11-4f50-b065-10dfc13affc2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.621292 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "50da7d2a-6f11-4f50-b065-10dfc13affc2" (UID: "50da7d2a-6f11-4f50-b065-10dfc13affc2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.621959 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "50da7d2a-6f11-4f50-b065-10dfc13affc2" (UID: "50da7d2a-6f11-4f50-b065-10dfc13affc2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.666444 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.666478 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.666486 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.666497 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7hgb\" (UniqueName: \"kubernetes.io/projected/50da7d2a-6f11-4f50-b065-10dfc13affc2-kube-api-access-k7hgb\") on node \"crc\" DevicePath \"\"" Mar 10 
17:18:51 crc kubenswrapper[4749]: I0310 17:18:51.666508 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50da7d2a-6f11-4f50-b065-10dfc13affc2-config\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.281529 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68577db887-pd8q5" event={"ID":"50da7d2a-6f11-4f50-b065-10dfc13affc2","Type":"ContainerDied","Data":"41972297a963b1bbe48d0f5cb48646ff2b4268b82651785b9eb3900ffaa080f3"} Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.281595 4749 scope.go:117] "RemoveContainer" containerID="b2ec6e620b2c600ea2c20fb4f7dbef9bf8ce5fa12b10b8e4b7ab80755972b9ef" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.282819 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68577db887-pd8q5" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.306111 4749 scope.go:117] "RemoveContainer" containerID="f1b5e3e539ed523d22bdaf861bd70a26f3440aa03b762e8f08283864461abb62" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.332297 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68577db887-pd8q5"] Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.343900 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68577db887-pd8q5"] Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.640425 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.782546 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-config-data\") pod \"249a5269-fc54-4bee-98a8-de5b4bad612b\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.782600 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-combined-ca-bundle\") pod \"249a5269-fc54-4bee-98a8-de5b4bad612b\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.782707 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-credential-keys\") pod \"249a5269-fc54-4bee-98a8-de5b4bad612b\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.782797 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z55gv\" (UniqueName: \"kubernetes.io/projected/249a5269-fc54-4bee-98a8-de5b4bad612b-kube-api-access-z55gv\") pod \"249a5269-fc54-4bee-98a8-de5b4bad612b\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.782901 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-fernet-keys\") pod \"249a5269-fc54-4bee-98a8-de5b4bad612b\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.782924 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-scripts\") pod \"249a5269-fc54-4bee-98a8-de5b4bad612b\" (UID: \"249a5269-fc54-4bee-98a8-de5b4bad612b\") " Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.787884 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-scripts" (OuterVolumeSpecName: "scripts") pod "249a5269-fc54-4bee-98a8-de5b4bad612b" (UID: "249a5269-fc54-4bee-98a8-de5b4bad612b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.787948 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/249a5269-fc54-4bee-98a8-de5b4bad612b-kube-api-access-z55gv" (OuterVolumeSpecName: "kube-api-access-z55gv") pod "249a5269-fc54-4bee-98a8-de5b4bad612b" (UID: "249a5269-fc54-4bee-98a8-de5b4bad612b"). InnerVolumeSpecName "kube-api-access-z55gv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.787982 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "249a5269-fc54-4bee-98a8-de5b4bad612b" (UID: "249a5269-fc54-4bee-98a8-de5b4bad612b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.789360 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "249a5269-fc54-4bee-98a8-de5b4bad612b" (UID: "249a5269-fc54-4bee-98a8-de5b4bad612b"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.804217 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-config-data" (OuterVolumeSpecName: "config-data") pod "249a5269-fc54-4bee-98a8-de5b4bad612b" (UID: "249a5269-fc54-4bee-98a8-de5b4bad612b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.807542 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "249a5269-fc54-4bee-98a8-de5b4bad612b" (UID: "249a5269-fc54-4bee-98a8-de5b4bad612b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.885411 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.885449 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.885484 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.885495 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z55gv\" (UniqueName: \"kubernetes.io/projected/249a5269-fc54-4bee-98a8-de5b4bad612b-kube-api-access-z55gv\") on node \"crc\" 
DevicePath \"\"" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.885504 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:52 crc kubenswrapper[4749]: I0310 17:18:52.885512 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249a5269-fc54-4bee-98a8-de5b4bad612b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.292343 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jfqpd" event={"ID":"249a5269-fc54-4bee-98a8-de5b4bad612b","Type":"ContainerDied","Data":"5f186ea32b81f1c077340d77ad1c364e4a97c83b1b5a258678059bcabe1f2108"} Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.292680 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f186ea32b81f1c077340d77ad1c364e4a97c83b1b5a258678059bcabe1f2108" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.292470 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jfqpd" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.370420 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-75498b5f58-p5jp4"] Mar 10 17:18:53 crc kubenswrapper[4749]: E0310 17:18:53.370763 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50da7d2a-6f11-4f50-b065-10dfc13affc2" containerName="dnsmasq-dns" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.370789 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="50da7d2a-6f11-4f50-b065-10dfc13affc2" containerName="dnsmasq-dns" Mar 10 17:18:53 crc kubenswrapper[4749]: E0310 17:18:53.370815 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249a5269-fc54-4bee-98a8-de5b4bad612b" containerName="keystone-bootstrap" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.370825 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="249a5269-fc54-4bee-98a8-de5b4bad612b" containerName="keystone-bootstrap" Mar 10 17:18:53 crc kubenswrapper[4749]: E0310 17:18:53.370853 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50da7d2a-6f11-4f50-b065-10dfc13affc2" containerName="init" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.370865 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="50da7d2a-6f11-4f50-b065-10dfc13affc2" containerName="init" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.371052 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="50da7d2a-6f11-4f50-b065-10dfc13affc2" containerName="dnsmasq-dns" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.371074 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="249a5269-fc54-4bee-98a8-de5b4bad612b" containerName="keystone-bootstrap" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.371629 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.374637 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.374994 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.375212 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-msrwg" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.375426 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.377198 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.377954 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.382902 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-75498b5f58-p5jp4"] Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.502149 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-credential-keys\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.502187 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-combined-ca-bundle\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " 
pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.502218 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-fernet-keys\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.502242 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdmt5\" (UniqueName: \"kubernetes.io/projected/1ed6be0f-634b-450a-b7fb-f2b679793b46-kube-api-access-tdmt5\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.502331 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-internal-tls-certs\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.502366 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-config-data\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.502411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-scripts\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " 
pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.502473 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-public-tls-certs\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.603662 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-public-tls-certs\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.603732 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-credential-keys\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.603761 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-combined-ca-bundle\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.603792 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-fernet-keys\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc 
kubenswrapper[4749]: I0310 17:18:53.603936 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdmt5\" (UniqueName: \"kubernetes.io/projected/1ed6be0f-634b-450a-b7fb-f2b679793b46-kube-api-access-tdmt5\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.604082 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-internal-tls-certs\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.604129 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-config-data\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.604164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-scripts\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.607876 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-public-tls-certs\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.608462 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-scripts\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.609947 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-internal-tls-certs\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.613625 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-fernet-keys\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.614561 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-config-data\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.617314 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:18:53 crc kubenswrapper[4749]: E0310 17:18:53.617690 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.623019 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-credential-keys\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.623442 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed6be0f-634b-450a-b7fb-f2b679793b46-combined-ca-bundle\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.624272 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdmt5\" (UniqueName: \"kubernetes.io/projected/1ed6be0f-634b-450a-b7fb-f2b679793b46-kube-api-access-tdmt5\") pod \"keystone-75498b5f58-p5jp4\" (UID: \"1ed6be0f-634b-450a-b7fb-f2b679793b46\") " pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.632143 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50da7d2a-6f11-4f50-b065-10dfc13affc2" path="/var/lib/kubelet/pods/50da7d2a-6f11-4f50-b065-10dfc13affc2/volumes" Mar 10 17:18:53 crc kubenswrapper[4749]: I0310 17:18:53.703925 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:54 crc kubenswrapper[4749]: I0310 17:18:54.182403 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-75498b5f58-p5jp4"] Mar 10 17:18:54 crc kubenswrapper[4749]: W0310 17:18:54.189506 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ed6be0f_634b_450a_b7fb_f2b679793b46.slice/crio-a2ce88780355e810510db7b7dd840cae451c95967d68e4896af4047f19acf884 WatchSource:0}: Error finding container a2ce88780355e810510db7b7dd840cae451c95967d68e4896af4047f19acf884: Status 404 returned error can't find the container with id a2ce88780355e810510db7b7dd840cae451c95967d68e4896af4047f19acf884 Mar 10 17:18:54 crc kubenswrapper[4749]: I0310 17:18:54.301155 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-75498b5f58-p5jp4" event={"ID":"1ed6be0f-634b-450a-b7fb-f2b679793b46","Type":"ContainerStarted","Data":"a2ce88780355e810510db7b7dd840cae451c95967d68e4896af4047f19acf884"} Mar 10 17:18:54 crc kubenswrapper[4749]: I0310 17:18:54.535149 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:54 crc kubenswrapper[4749]: I0310 17:18:54.535456 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:54 crc kubenswrapper[4749]: I0310 17:18:54.580035 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:55 crc kubenswrapper[4749]: I0310 17:18:55.310857 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-75498b5f58-p5jp4" event={"ID":"1ed6be0f-634b-450a-b7fb-f2b679793b46","Type":"ContainerStarted","Data":"953d0cea94dfa2080f265342fa6278c7a74b5f27f38110b55ecc71d3aff9f316"} Mar 10 17:18:55 crc 
kubenswrapper[4749]: I0310 17:18:55.334861 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-75498b5f58-p5jp4" podStartSLOduration=2.3348322440000002 podStartE2EDuration="2.334832244s" podCreationTimestamp="2026-03-10 17:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:18:55.328776309 +0000 UTC m=+5432.450641996" watchObservedRunningTime="2026-03-10 17:18:55.334832244 +0000 UTC m=+5432.456697941" Mar 10 17:18:55 crc kubenswrapper[4749]: I0310 17:18:55.372616 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:55 crc kubenswrapper[4749]: I0310 17:18:55.423788 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpfsj"] Mar 10 17:18:56 crc kubenswrapper[4749]: I0310 17:18:56.317350 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:18:57 crc kubenswrapper[4749]: I0310 17:18:57.326504 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wpfsj" podUID="b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" containerName="registry-server" containerID="cri-o://fd777b73771b6adb1f435aac1acb45543ebf9a78efeb392ae5684555814e39dd" gracePeriod=2 Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.248800 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.334904 4749 generic.go:334] "Generic (PLEG): container finished" podID="b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" containerID="fd777b73771b6adb1f435aac1acb45543ebf9a78efeb392ae5684555814e39dd" exitCode=0 Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.334961 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpfsj" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.334965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpfsj" event={"ID":"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561","Type":"ContainerDied","Data":"fd777b73771b6adb1f435aac1acb45543ebf9a78efeb392ae5684555814e39dd"} Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.335050 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpfsj" event={"ID":"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561","Type":"ContainerDied","Data":"53bac7efb0b552ea281975c1b652687292e0e5440034a2ba5669edf53f7196a7"} Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.335072 4749 scope.go:117] "RemoveContainer" containerID="fd777b73771b6adb1f435aac1acb45543ebf9a78efeb392ae5684555814e39dd" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.354553 4749 scope.go:117] "RemoveContainer" containerID="a2b96fca0a84ccb5a67da4e58bc94ee5bf7e63a65eb40666f52ef4823e316420" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.379076 4749 scope.go:117] "RemoveContainer" containerID="df155de2d5ff5e821f5d3314dee9ab93e037ef29c0b71df934837834c04214a5" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.382095 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-utilities\") pod 
\"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\" (UID: \"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\") " Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.382253 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-catalog-content\") pod \"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\" (UID: \"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\") " Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.382363 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf7rw\" (UniqueName: \"kubernetes.io/projected/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-kube-api-access-qf7rw\") pod \"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\" (UID: \"b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561\") " Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.383093 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-utilities" (OuterVolumeSpecName: "utilities") pod "b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" (UID: "b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.387569 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-kube-api-access-qf7rw" (OuterVolumeSpecName: "kube-api-access-qf7rw") pod "b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" (UID: "b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561"). InnerVolumeSpecName "kube-api-access-qf7rw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.411790 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" (UID: "b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.463221 4749 scope.go:117] "RemoveContainer" containerID="fd777b73771b6adb1f435aac1acb45543ebf9a78efeb392ae5684555814e39dd" Mar 10 17:18:58 crc kubenswrapper[4749]: E0310 17:18:58.463973 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd777b73771b6adb1f435aac1acb45543ebf9a78efeb392ae5684555814e39dd\": container with ID starting with fd777b73771b6adb1f435aac1acb45543ebf9a78efeb392ae5684555814e39dd not found: ID does not exist" containerID="fd777b73771b6adb1f435aac1acb45543ebf9a78efeb392ae5684555814e39dd" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.464053 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd777b73771b6adb1f435aac1acb45543ebf9a78efeb392ae5684555814e39dd"} err="failed to get container status \"fd777b73771b6adb1f435aac1acb45543ebf9a78efeb392ae5684555814e39dd\": rpc error: code = NotFound desc = could not find container \"fd777b73771b6adb1f435aac1acb45543ebf9a78efeb392ae5684555814e39dd\": container with ID starting with fd777b73771b6adb1f435aac1acb45543ebf9a78efeb392ae5684555814e39dd not found: ID does not exist" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.464081 4749 scope.go:117] "RemoveContainer" containerID="a2b96fca0a84ccb5a67da4e58bc94ee5bf7e63a65eb40666f52ef4823e316420" Mar 10 17:18:58 crc kubenswrapper[4749]: E0310 17:18:58.473986 4749 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b96fca0a84ccb5a67da4e58bc94ee5bf7e63a65eb40666f52ef4823e316420\": container with ID starting with a2b96fca0a84ccb5a67da4e58bc94ee5bf7e63a65eb40666f52ef4823e316420 not found: ID does not exist" containerID="a2b96fca0a84ccb5a67da4e58bc94ee5bf7e63a65eb40666f52ef4823e316420" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.474030 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b96fca0a84ccb5a67da4e58bc94ee5bf7e63a65eb40666f52ef4823e316420"} err="failed to get container status \"a2b96fca0a84ccb5a67da4e58bc94ee5bf7e63a65eb40666f52ef4823e316420\": rpc error: code = NotFound desc = could not find container \"a2b96fca0a84ccb5a67da4e58bc94ee5bf7e63a65eb40666f52ef4823e316420\": container with ID starting with a2b96fca0a84ccb5a67da4e58bc94ee5bf7e63a65eb40666f52ef4823e316420 not found: ID does not exist" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.474058 4749 scope.go:117] "RemoveContainer" containerID="df155de2d5ff5e821f5d3314dee9ab93e037ef29c0b71df934837834c04214a5" Mar 10 17:18:58 crc kubenswrapper[4749]: E0310 17:18:58.474885 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df155de2d5ff5e821f5d3314dee9ab93e037ef29c0b71df934837834c04214a5\": container with ID starting with df155de2d5ff5e821f5d3314dee9ab93e037ef29c0b71df934837834c04214a5 not found: ID does not exist" containerID="df155de2d5ff5e821f5d3314dee9ab93e037ef29c0b71df934837834c04214a5" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.474912 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df155de2d5ff5e821f5d3314dee9ab93e037ef29c0b71df934837834c04214a5"} err="failed to get container status \"df155de2d5ff5e821f5d3314dee9ab93e037ef29c0b71df934837834c04214a5\": rpc error: code = NotFound desc = could 
not find container \"df155de2d5ff5e821f5d3314dee9ab93e037ef29c0b71df934837834c04214a5\": container with ID starting with df155de2d5ff5e821f5d3314dee9ab93e037ef29c0b71df934837834c04214a5 not found: ID does not exist" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.484655 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf7rw\" (UniqueName: \"kubernetes.io/projected/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-kube-api-access-qf7rw\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.484734 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.484748 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.674592 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpfsj"] Mar 10 17:18:58 crc kubenswrapper[4749]: I0310 17:18:58.681136 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpfsj"] Mar 10 17:18:59 crc kubenswrapper[4749]: I0310 17:18:59.616727 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" path="/var/lib/kubelet/pods/b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561/volumes" Mar 10 17:19:04 crc kubenswrapper[4749]: I0310 17:19:04.606694 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:19:04 crc kubenswrapper[4749]: E0310 17:19:04.607571 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:19:18 crc kubenswrapper[4749]: I0310 17:19:18.607764 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:19:18 crc kubenswrapper[4749]: E0310 17:19:18.608724 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:19:25 crc kubenswrapper[4749]: I0310 17:19:25.335938 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-75498b5f58-p5jp4" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.538239 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 17:19:29 crc kubenswrapper[4749]: E0310 17:19:29.539312 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" containerName="extract-utilities" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.539325 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" containerName="extract-utilities" Mar 10 17:19:29 crc kubenswrapper[4749]: E0310 17:19:29.539335 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" containerName="extract-content" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.539341 4749 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" containerName="extract-content" Mar 10 17:19:29 crc kubenswrapper[4749]: E0310 17:19:29.539356 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" containerName="registry-server" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.539362 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" containerName="registry-server" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.539562 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3b9b2c9-2767-4ca6-8c9e-4dc01abc3561" containerName="registry-server" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.540142 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.542941 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.543118 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.543656 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-lrl7c" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.552747 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.654021 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c365107c-5e98-4f1e-abf4-9efe9e71de6c-openstack-config\") pod \"openstackclient\" (UID: \"c365107c-5e98-4f1e-abf4-9efe9e71de6c\") " pod="openstack/openstackclient" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 
17:19:29.654447 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c365107c-5e98-4f1e-abf4-9efe9e71de6c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c365107c-5e98-4f1e-abf4-9efe9e71de6c\") " pod="openstack/openstackclient" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.654485 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c365107c-5e98-4f1e-abf4-9efe9e71de6c-openstack-config-secret\") pod \"openstackclient\" (UID: \"c365107c-5e98-4f1e-abf4-9efe9e71de6c\") " pod="openstack/openstackclient" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.654551 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtd8j\" (UniqueName: \"kubernetes.io/projected/c365107c-5e98-4f1e-abf4-9efe9e71de6c-kube-api-access-mtd8j\") pod \"openstackclient\" (UID: \"c365107c-5e98-4f1e-abf4-9efe9e71de6c\") " pod="openstack/openstackclient" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.755858 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtd8j\" (UniqueName: \"kubernetes.io/projected/c365107c-5e98-4f1e-abf4-9efe9e71de6c-kube-api-access-mtd8j\") pod \"openstackclient\" (UID: \"c365107c-5e98-4f1e-abf4-9efe9e71de6c\") " pod="openstack/openstackclient" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.755997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c365107c-5e98-4f1e-abf4-9efe9e71de6c-openstack-config\") pod \"openstackclient\" (UID: \"c365107c-5e98-4f1e-abf4-9efe9e71de6c\") " pod="openstack/openstackclient" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.756078 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c365107c-5e98-4f1e-abf4-9efe9e71de6c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c365107c-5e98-4f1e-abf4-9efe9e71de6c\") " pod="openstack/openstackclient" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.756116 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c365107c-5e98-4f1e-abf4-9efe9e71de6c-openstack-config-secret\") pod \"openstackclient\" (UID: \"c365107c-5e98-4f1e-abf4-9efe9e71de6c\") " pod="openstack/openstackclient" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.757153 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c365107c-5e98-4f1e-abf4-9efe9e71de6c-openstack-config\") pod \"openstackclient\" (UID: \"c365107c-5e98-4f1e-abf4-9efe9e71de6c\") " pod="openstack/openstackclient" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.761922 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c365107c-5e98-4f1e-abf4-9efe9e71de6c-openstack-config-secret\") pod \"openstackclient\" (UID: \"c365107c-5e98-4f1e-abf4-9efe9e71de6c\") " pod="openstack/openstackclient" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.762068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c365107c-5e98-4f1e-abf4-9efe9e71de6c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c365107c-5e98-4f1e-abf4-9efe9e71de6c\") " pod="openstack/openstackclient" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.782009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtd8j\" (UniqueName: 
\"kubernetes.io/projected/c365107c-5e98-4f1e-abf4-9efe9e71de6c-kube-api-access-mtd8j\") pod \"openstackclient\" (UID: \"c365107c-5e98-4f1e-abf4-9efe9e71de6c\") " pod="openstack/openstackclient" Mar 10 17:19:29 crc kubenswrapper[4749]: I0310 17:19:29.899848 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 17:19:30 crc kubenswrapper[4749]: I0310 17:19:30.338688 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 17:19:30 crc kubenswrapper[4749]: I0310 17:19:30.607213 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:19:30 crc kubenswrapper[4749]: E0310 17:19:30.608277 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:19:30 crc kubenswrapper[4749]: I0310 17:19:30.633120 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c365107c-5e98-4f1e-abf4-9efe9e71de6c","Type":"ContainerStarted","Data":"615c22658186f73e6daf937cfcfe9222ee900515b993572ffe6f23642705c740"} Mar 10 17:19:30 crc kubenswrapper[4749]: I0310 17:19:30.633518 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c365107c-5e98-4f1e-abf4-9efe9e71de6c","Type":"ContainerStarted","Data":"9ca58103a177d274e40037bb56190d38745bda309a36680e7dc946234b768c42"} Mar 10 17:19:30 crc kubenswrapper[4749]: I0310 17:19:30.653873 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.653849001 
podStartE2EDuration="1.653849001s" podCreationTimestamp="2026-03-10 17:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 17:19:30.651642831 +0000 UTC m=+5467.773508538" watchObservedRunningTime="2026-03-10 17:19:30.653849001 +0000 UTC m=+5467.775714688" Mar 10 17:19:42 crc kubenswrapper[4749]: I0310 17:19:42.606566 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:19:42 crc kubenswrapper[4749]: E0310 17:19:42.607109 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:19:53 crc kubenswrapper[4749]: I0310 17:19:53.612933 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:19:53 crc kubenswrapper[4749]: E0310 17:19:53.613685 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:20:00 crc kubenswrapper[4749]: I0310 17:20:00.143110 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552720-mzvxb"] Mar 10 17:20:00 crc kubenswrapper[4749]: I0310 17:20:00.144733 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552720-mzvxb" Mar 10 17:20:00 crc kubenswrapper[4749]: I0310 17:20:00.148242 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:20:00 crc kubenswrapper[4749]: I0310 17:20:00.148557 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:20:00 crc kubenswrapper[4749]: I0310 17:20:00.148586 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:20:00 crc kubenswrapper[4749]: I0310 17:20:00.152483 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552720-mzvxb"] Mar 10 17:20:00 crc kubenswrapper[4749]: I0310 17:20:00.327215 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdn9f\" (UniqueName: \"kubernetes.io/projected/a558e344-f2c6-471b-94ea-7ace7dd44b18-kube-api-access-sdn9f\") pod \"auto-csr-approver-29552720-mzvxb\" (UID: \"a558e344-f2c6-471b-94ea-7ace7dd44b18\") " pod="openshift-infra/auto-csr-approver-29552720-mzvxb" Mar 10 17:20:00 crc kubenswrapper[4749]: I0310 17:20:00.428534 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdn9f\" (UniqueName: \"kubernetes.io/projected/a558e344-f2c6-471b-94ea-7ace7dd44b18-kube-api-access-sdn9f\") pod \"auto-csr-approver-29552720-mzvxb\" (UID: \"a558e344-f2c6-471b-94ea-7ace7dd44b18\") " pod="openshift-infra/auto-csr-approver-29552720-mzvxb" Mar 10 17:20:00 crc kubenswrapper[4749]: I0310 17:20:00.448791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdn9f\" (UniqueName: \"kubernetes.io/projected/a558e344-f2c6-471b-94ea-7ace7dd44b18-kube-api-access-sdn9f\") pod \"auto-csr-approver-29552720-mzvxb\" (UID: \"a558e344-f2c6-471b-94ea-7ace7dd44b18\") " 
pod="openshift-infra/auto-csr-approver-29552720-mzvxb" Mar 10 17:20:00 crc kubenswrapper[4749]: I0310 17:20:00.473155 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552720-mzvxb" Mar 10 17:20:00 crc kubenswrapper[4749]: I0310 17:20:00.908756 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552720-mzvxb"] Mar 10 17:20:01 crc kubenswrapper[4749]: I0310 17:20:01.876627 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552720-mzvxb" event={"ID":"a558e344-f2c6-471b-94ea-7ace7dd44b18","Type":"ContainerStarted","Data":"0a36170dfc77fc5b5e1320aa1e165c65e34d6529cec2eb6766877eeae43eb5fe"} Mar 10 17:20:02 crc kubenswrapper[4749]: I0310 17:20:02.886954 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552720-mzvxb" event={"ID":"a558e344-f2c6-471b-94ea-7ace7dd44b18","Type":"ContainerStarted","Data":"22e44cdbb944428ed4dc01a31007cabbdd7e402993d148f14678e3463f0ce70a"} Mar 10 17:20:02 crc kubenswrapper[4749]: I0310 17:20:02.904148 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552720-mzvxb" podStartSLOduration=1.431491796 podStartE2EDuration="2.904128602s" podCreationTimestamp="2026-03-10 17:20:00 +0000 UTC" firstStartedPulling="2026-03-10 17:20:00.912998505 +0000 UTC m=+5498.034864192" lastFinishedPulling="2026-03-10 17:20:02.385635311 +0000 UTC m=+5499.507500998" observedRunningTime="2026-03-10 17:20:02.899949439 +0000 UTC m=+5500.021815126" watchObservedRunningTime="2026-03-10 17:20:02.904128602 +0000 UTC m=+5500.025994289" Mar 10 17:20:03 crc kubenswrapper[4749]: I0310 17:20:03.898359 4749 generic.go:334] "Generic (PLEG): container finished" podID="a558e344-f2c6-471b-94ea-7ace7dd44b18" containerID="22e44cdbb944428ed4dc01a31007cabbdd7e402993d148f14678e3463f0ce70a" exitCode=0 Mar 10 17:20:03 crc 
kubenswrapper[4749]: I0310 17:20:03.898478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552720-mzvxb" event={"ID":"a558e344-f2c6-471b-94ea-7ace7dd44b18","Type":"ContainerDied","Data":"22e44cdbb944428ed4dc01a31007cabbdd7e402993d148f14678e3463f0ce70a"} Mar 10 17:20:05 crc kubenswrapper[4749]: I0310 17:20:05.196510 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552720-mzvxb" Mar 10 17:20:05 crc kubenswrapper[4749]: I0310 17:20:05.338701 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdn9f\" (UniqueName: \"kubernetes.io/projected/a558e344-f2c6-471b-94ea-7ace7dd44b18-kube-api-access-sdn9f\") pod \"a558e344-f2c6-471b-94ea-7ace7dd44b18\" (UID: \"a558e344-f2c6-471b-94ea-7ace7dd44b18\") " Mar 10 17:20:05 crc kubenswrapper[4749]: I0310 17:20:05.344448 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a558e344-f2c6-471b-94ea-7ace7dd44b18-kube-api-access-sdn9f" (OuterVolumeSpecName: "kube-api-access-sdn9f") pod "a558e344-f2c6-471b-94ea-7ace7dd44b18" (UID: "a558e344-f2c6-471b-94ea-7ace7dd44b18"). InnerVolumeSpecName "kube-api-access-sdn9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:20:05 crc kubenswrapper[4749]: I0310 17:20:05.440991 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdn9f\" (UniqueName: \"kubernetes.io/projected/a558e344-f2c6-471b-94ea-7ace7dd44b18-kube-api-access-sdn9f\") on node \"crc\" DevicePath \"\"" Mar 10 17:20:05 crc kubenswrapper[4749]: I0310 17:20:05.917138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552720-mzvxb" event={"ID":"a558e344-f2c6-471b-94ea-7ace7dd44b18","Type":"ContainerDied","Data":"0a36170dfc77fc5b5e1320aa1e165c65e34d6529cec2eb6766877eeae43eb5fe"} Mar 10 17:20:05 crc kubenswrapper[4749]: I0310 17:20:05.917196 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a36170dfc77fc5b5e1320aa1e165c65e34d6529cec2eb6766877eeae43eb5fe" Mar 10 17:20:05 crc kubenswrapper[4749]: I0310 17:20:05.917819 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552720-mzvxb" Mar 10 17:20:05 crc kubenswrapper[4749]: I0310 17:20:05.970941 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552714-ck9tm"] Mar 10 17:20:05 crc kubenswrapper[4749]: I0310 17:20:05.979625 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552714-ck9tm"] Mar 10 17:20:06 crc kubenswrapper[4749]: I0310 17:20:06.607928 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:20:06 crc kubenswrapper[4749]: E0310 17:20:06.608297 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:20:07 crc kubenswrapper[4749]: I0310 17:20:07.617552 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dacc0ccc-aa71-4852-9518-e718f3491b86" path="/var/lib/kubelet/pods/dacc0ccc-aa71-4852-9518-e718f3491b86/volumes" Mar 10 17:20:20 crc kubenswrapper[4749]: I0310 17:20:20.607038 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:20:20 crc kubenswrapper[4749]: E0310 17:20:20.608062 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:20:27 crc kubenswrapper[4749]: I0310 17:20:27.590158 4749 scope.go:117] "RemoveContainer" containerID="512fd9b6b90f9558e3b5ceb36d336b3da56f827d0695b9132c302430e3574925" Mar 10 17:20:31 crc kubenswrapper[4749]: I0310 17:20:31.607017 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:20:31 crc kubenswrapper[4749]: E0310 17:20:31.607802 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:20:44 crc kubenswrapper[4749]: I0310 17:20:44.606545 4749 scope.go:117] "RemoveContainer" 
containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:20:44 crc kubenswrapper[4749]: E0310 17:20:44.607396 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:20:58 crc kubenswrapper[4749]: I0310 17:20:58.606713 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:20:58 crc kubenswrapper[4749]: E0310 17:20:58.607616 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:21:09 crc kubenswrapper[4749]: I0310 17:21:09.607020 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:21:09 crc kubenswrapper[4749]: E0310 17:21:09.624960 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:21:22 crc kubenswrapper[4749]: I0310 17:21:22.607502 4749 scope.go:117] 
"RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:21:22 crc kubenswrapper[4749]: E0310 17:21:22.608260 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:21:34 crc kubenswrapper[4749]: I0310 17:21:34.607041 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:21:34 crc kubenswrapper[4749]: E0310 17:21:34.607865 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:21:49 crc kubenswrapper[4749]: I0310 17:21:49.608317 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:21:49 crc kubenswrapper[4749]: E0310 17:21:49.609599 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:22:00 crc kubenswrapper[4749]: I0310 17:22:00.164070 
4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552722-qcwsm"] Mar 10 17:22:00 crc kubenswrapper[4749]: E0310 17:22:00.165315 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a558e344-f2c6-471b-94ea-7ace7dd44b18" containerName="oc" Mar 10 17:22:00 crc kubenswrapper[4749]: I0310 17:22:00.165331 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a558e344-f2c6-471b-94ea-7ace7dd44b18" containerName="oc" Mar 10 17:22:00 crc kubenswrapper[4749]: I0310 17:22:00.165549 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a558e344-f2c6-471b-94ea-7ace7dd44b18" containerName="oc" Mar 10 17:22:00 crc kubenswrapper[4749]: I0310 17:22:00.166228 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552722-qcwsm" Mar 10 17:22:00 crc kubenswrapper[4749]: I0310 17:22:00.171073 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:22:00 crc kubenswrapper[4749]: I0310 17:22:00.171686 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:22:00 crc kubenswrapper[4749]: I0310 17:22:00.172625 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:22:00 crc kubenswrapper[4749]: I0310 17:22:00.177436 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552722-qcwsm"] Mar 10 17:22:00 crc kubenswrapper[4749]: I0310 17:22:00.247014 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6ztd\" (UniqueName: \"kubernetes.io/projected/fc87ef98-b6a1-48d2-87c8-cb79228e6ecb-kube-api-access-j6ztd\") pod \"auto-csr-approver-29552722-qcwsm\" (UID: \"fc87ef98-b6a1-48d2-87c8-cb79228e6ecb\") " 
pod="openshift-infra/auto-csr-approver-29552722-qcwsm" Mar 10 17:22:00 crc kubenswrapper[4749]: I0310 17:22:00.348551 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6ztd\" (UniqueName: \"kubernetes.io/projected/fc87ef98-b6a1-48d2-87c8-cb79228e6ecb-kube-api-access-j6ztd\") pod \"auto-csr-approver-29552722-qcwsm\" (UID: \"fc87ef98-b6a1-48d2-87c8-cb79228e6ecb\") " pod="openshift-infra/auto-csr-approver-29552722-qcwsm" Mar 10 17:22:00 crc kubenswrapper[4749]: I0310 17:22:00.368624 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6ztd\" (UniqueName: \"kubernetes.io/projected/fc87ef98-b6a1-48d2-87c8-cb79228e6ecb-kube-api-access-j6ztd\") pod \"auto-csr-approver-29552722-qcwsm\" (UID: \"fc87ef98-b6a1-48d2-87c8-cb79228e6ecb\") " pod="openshift-infra/auto-csr-approver-29552722-qcwsm" Mar 10 17:22:00 crc kubenswrapper[4749]: I0310 17:22:00.510731 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552722-qcwsm" Mar 10 17:22:00 crc kubenswrapper[4749]: I0310 17:22:00.947974 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552722-qcwsm"] Mar 10 17:22:01 crc kubenswrapper[4749]: I0310 17:22:01.216180 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552722-qcwsm" event={"ID":"fc87ef98-b6a1-48d2-87c8-cb79228e6ecb","Type":"ContainerStarted","Data":"6602c886d89cb6f4399b8452bec1d03fa30480bebd7d20bf3bea35456b4225f2"} Mar 10 17:22:02 crc kubenswrapper[4749]: I0310 17:22:02.607273 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:22:02 crc kubenswrapper[4749]: E0310 17:22:02.608037 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:22:03 crc kubenswrapper[4749]: I0310 17:22:03.234176 4749 generic.go:334] "Generic (PLEG): container finished" podID="fc87ef98-b6a1-48d2-87c8-cb79228e6ecb" containerID="18518adbd38e16abc8900975975a5ba227333c3d4513e034aa328c32fdf5502a" exitCode=0 Mar 10 17:22:03 crc kubenswrapper[4749]: I0310 17:22:03.234316 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552722-qcwsm" event={"ID":"fc87ef98-b6a1-48d2-87c8-cb79228e6ecb","Type":"ContainerDied","Data":"18518adbd38e16abc8900975975a5ba227333c3d4513e034aa328c32fdf5502a"} Mar 10 17:22:04 crc kubenswrapper[4749]: I0310 17:22:04.555524 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552722-qcwsm" Mar 10 17:22:04 crc kubenswrapper[4749]: I0310 17:22:04.724443 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6ztd\" (UniqueName: \"kubernetes.io/projected/fc87ef98-b6a1-48d2-87c8-cb79228e6ecb-kube-api-access-j6ztd\") pod \"fc87ef98-b6a1-48d2-87c8-cb79228e6ecb\" (UID: \"fc87ef98-b6a1-48d2-87c8-cb79228e6ecb\") " Mar 10 17:22:04 crc kubenswrapper[4749]: I0310 17:22:04.729952 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc87ef98-b6a1-48d2-87c8-cb79228e6ecb-kube-api-access-j6ztd" (OuterVolumeSpecName: "kube-api-access-j6ztd") pod "fc87ef98-b6a1-48d2-87c8-cb79228e6ecb" (UID: "fc87ef98-b6a1-48d2-87c8-cb79228e6ecb"). InnerVolumeSpecName "kube-api-access-j6ztd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:22:04 crc kubenswrapper[4749]: I0310 17:22:04.826472 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6ztd\" (UniqueName: \"kubernetes.io/projected/fc87ef98-b6a1-48d2-87c8-cb79228e6ecb-kube-api-access-j6ztd\") on node \"crc\" DevicePath \"\"" Mar 10 17:22:05 crc kubenswrapper[4749]: I0310 17:22:05.264645 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552722-qcwsm" event={"ID":"fc87ef98-b6a1-48d2-87c8-cb79228e6ecb","Type":"ContainerDied","Data":"6602c886d89cb6f4399b8452bec1d03fa30480bebd7d20bf3bea35456b4225f2"} Mar 10 17:22:05 crc kubenswrapper[4749]: I0310 17:22:05.264689 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6602c886d89cb6f4399b8452bec1d03fa30480bebd7d20bf3bea35456b4225f2" Mar 10 17:22:05 crc kubenswrapper[4749]: I0310 17:22:05.264798 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552722-qcwsm" Mar 10 17:22:05 crc kubenswrapper[4749]: I0310 17:22:05.650526 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552716-rnffm"] Mar 10 17:22:05 crc kubenswrapper[4749]: I0310 17:22:05.655878 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552716-rnffm"] Mar 10 17:22:07 crc kubenswrapper[4749]: I0310 17:22:07.617237 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0ed8b50-c65b-4d45-88f8-a5ba25b277c7" path="/var/lib/kubelet/pods/d0ed8b50-c65b-4d45-88f8-a5ba25b277c7/volumes" Mar 10 17:22:11 crc kubenswrapper[4749]: I0310 17:22:11.869192 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fhfp8"] Mar 10 17:22:11 crc kubenswrapper[4749]: E0310 17:22:11.870853 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fc87ef98-b6a1-48d2-87c8-cb79228e6ecb" containerName="oc" Mar 10 17:22:11 crc kubenswrapper[4749]: I0310 17:22:11.870881 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc87ef98-b6a1-48d2-87c8-cb79228e6ecb" containerName="oc" Mar 10 17:22:11 crc kubenswrapper[4749]: I0310 17:22:11.871085 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc87ef98-b6a1-48d2-87c8-cb79228e6ecb" containerName="oc" Mar 10 17:22:11 crc kubenswrapper[4749]: I0310 17:22:11.872510 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:11 crc kubenswrapper[4749]: I0310 17:22:11.886360 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhfp8"] Mar 10 17:22:12 crc kubenswrapper[4749]: I0310 17:22:12.043916 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq2b8\" (UniqueName: \"kubernetes.io/projected/2785c135-405d-460b-8201-552b59689f1b-kube-api-access-wq2b8\") pod \"community-operators-fhfp8\" (UID: \"2785c135-405d-460b-8201-552b59689f1b\") " pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:12 crc kubenswrapper[4749]: I0310 17:22:12.044065 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2785c135-405d-460b-8201-552b59689f1b-utilities\") pod \"community-operators-fhfp8\" (UID: \"2785c135-405d-460b-8201-552b59689f1b\") " pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:12 crc kubenswrapper[4749]: I0310 17:22:12.044132 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2785c135-405d-460b-8201-552b59689f1b-catalog-content\") pod \"community-operators-fhfp8\" (UID: \"2785c135-405d-460b-8201-552b59689f1b\") " 
pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:12 crc kubenswrapper[4749]: I0310 17:22:12.146127 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq2b8\" (UniqueName: \"kubernetes.io/projected/2785c135-405d-460b-8201-552b59689f1b-kube-api-access-wq2b8\") pod \"community-operators-fhfp8\" (UID: \"2785c135-405d-460b-8201-552b59689f1b\") " pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:12 crc kubenswrapper[4749]: I0310 17:22:12.146273 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2785c135-405d-460b-8201-552b59689f1b-utilities\") pod \"community-operators-fhfp8\" (UID: \"2785c135-405d-460b-8201-552b59689f1b\") " pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:12 crc kubenswrapper[4749]: I0310 17:22:12.146347 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2785c135-405d-460b-8201-552b59689f1b-catalog-content\") pod \"community-operators-fhfp8\" (UID: \"2785c135-405d-460b-8201-552b59689f1b\") " pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:12 crc kubenswrapper[4749]: I0310 17:22:12.146790 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2785c135-405d-460b-8201-552b59689f1b-utilities\") pod \"community-operators-fhfp8\" (UID: \"2785c135-405d-460b-8201-552b59689f1b\") " pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:12 crc kubenswrapper[4749]: I0310 17:22:12.146804 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2785c135-405d-460b-8201-552b59689f1b-catalog-content\") pod \"community-operators-fhfp8\" (UID: \"2785c135-405d-460b-8201-552b59689f1b\") " 
pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:12 crc kubenswrapper[4749]: I0310 17:22:12.171633 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq2b8\" (UniqueName: \"kubernetes.io/projected/2785c135-405d-460b-8201-552b59689f1b-kube-api-access-wq2b8\") pod \"community-operators-fhfp8\" (UID: \"2785c135-405d-460b-8201-552b59689f1b\") " pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:12 crc kubenswrapper[4749]: I0310 17:22:12.195444 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:12 crc kubenswrapper[4749]: I0310 17:22:12.839884 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhfp8"] Mar 10 17:22:12 crc kubenswrapper[4749]: W0310 17:22:12.853464 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2785c135_405d_460b_8201_552b59689f1b.slice/crio-2d048729d895b2d2b93cb71594eaeee9898c99f733d7da59b3070fc67f567d1f WatchSource:0}: Error finding container 2d048729d895b2d2b93cb71594eaeee9898c99f733d7da59b3070fc67f567d1f: Status 404 returned error can't find the container with id 2d048729d895b2d2b93cb71594eaeee9898c99f733d7da59b3070fc67f567d1f Mar 10 17:22:13 crc kubenswrapper[4749]: I0310 17:22:13.335004 4749 generic.go:334] "Generic (PLEG): container finished" podID="2785c135-405d-460b-8201-552b59689f1b" containerID="4b0dfc58c7d9526241e533f8969c147296fc50b1c66ba2bf02e1ce36c317fb8c" exitCode=0 Mar 10 17:22:13 crc kubenswrapper[4749]: I0310 17:22:13.335265 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhfp8" event={"ID":"2785c135-405d-460b-8201-552b59689f1b","Type":"ContainerDied","Data":"4b0dfc58c7d9526241e533f8969c147296fc50b1c66ba2bf02e1ce36c317fb8c"} Mar 10 17:22:13 crc kubenswrapper[4749]: I0310 17:22:13.335721 
4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhfp8" event={"ID":"2785c135-405d-460b-8201-552b59689f1b","Type":"ContainerStarted","Data":"2d048729d895b2d2b93cb71594eaeee9898c99f733d7da59b3070fc67f567d1f"} Mar 10 17:22:13 crc kubenswrapper[4749]: I0310 17:22:13.612739 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:22:13 crc kubenswrapper[4749]: E0310 17:22:13.613020 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:22:15 crc kubenswrapper[4749]: I0310 17:22:15.352016 4749 generic.go:334] "Generic (PLEG): container finished" podID="2785c135-405d-460b-8201-552b59689f1b" containerID="01ffc1dac2df5f35c6520de01e8cb2e22e6b3ff9493f8c517a14c0835e3796b1" exitCode=0 Mar 10 17:22:15 crc kubenswrapper[4749]: I0310 17:22:15.352218 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhfp8" event={"ID":"2785c135-405d-460b-8201-552b59689f1b","Type":"ContainerDied","Data":"01ffc1dac2df5f35c6520de01e8cb2e22e6b3ff9493f8c517a14c0835e3796b1"} Mar 10 17:22:16 crc kubenswrapper[4749]: I0310 17:22:16.052336 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7cq8f"] Mar 10 17:22:16 crc kubenswrapper[4749]: I0310 17:22:16.060069 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-7cq8f"] Mar 10 17:22:16 crc kubenswrapper[4749]: I0310 17:22:16.363890 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-fhfp8" event={"ID":"2785c135-405d-460b-8201-552b59689f1b","Type":"ContainerStarted","Data":"b4a8bb583fe65030b88834c1e2827cee638c95e727fe45d041fef93764b97643"} Mar 10 17:22:16 crc kubenswrapper[4749]: I0310 17:22:16.383249 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fhfp8" podStartSLOduration=2.978461293 podStartE2EDuration="5.383230574s" podCreationTimestamp="2026-03-10 17:22:11 +0000 UTC" firstStartedPulling="2026-03-10 17:22:13.337769891 +0000 UTC m=+5630.459635578" lastFinishedPulling="2026-03-10 17:22:15.742539162 +0000 UTC m=+5632.864404859" observedRunningTime="2026-03-10 17:22:16.38019566 +0000 UTC m=+5633.502061357" watchObservedRunningTime="2026-03-10 17:22:16.383230574 +0000 UTC m=+5633.505096261" Mar 10 17:22:17 crc kubenswrapper[4749]: I0310 17:22:17.620549 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a2202b-2b1d-4d85-909e-004bd8fa1d66" path="/var/lib/kubelet/pods/d7a2202b-2b1d-4d85-909e-004bd8fa1d66/volumes" Mar 10 17:22:22 crc kubenswrapper[4749]: I0310 17:22:22.196251 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:22 crc kubenswrapper[4749]: I0310 17:22:22.196750 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:22 crc kubenswrapper[4749]: I0310 17:22:22.238397 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:22 crc kubenswrapper[4749]: I0310 17:22:22.461391 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:22 crc kubenswrapper[4749]: I0310 17:22:22.507272 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-fhfp8"] Mar 10 17:22:24 crc kubenswrapper[4749]: I0310 17:22:24.426872 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fhfp8" podUID="2785c135-405d-460b-8201-552b59689f1b" containerName="registry-server" containerID="cri-o://b4a8bb583fe65030b88834c1e2827cee638c95e727fe45d041fef93764b97643" gracePeriod=2 Mar 10 17:22:24 crc kubenswrapper[4749]: I0310 17:22:24.987446 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.110728 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2785c135-405d-460b-8201-552b59689f1b-catalog-content\") pod \"2785c135-405d-460b-8201-552b59689f1b\" (UID: \"2785c135-405d-460b-8201-552b59689f1b\") " Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.110920 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2785c135-405d-460b-8201-552b59689f1b-utilities\") pod \"2785c135-405d-460b-8201-552b59689f1b\" (UID: \"2785c135-405d-460b-8201-552b59689f1b\") " Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.110990 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq2b8\" (UniqueName: \"kubernetes.io/projected/2785c135-405d-460b-8201-552b59689f1b-kube-api-access-wq2b8\") pod \"2785c135-405d-460b-8201-552b59689f1b\" (UID: \"2785c135-405d-460b-8201-552b59689f1b\") " Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.117772 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2785c135-405d-460b-8201-552b59689f1b-kube-api-access-wq2b8" (OuterVolumeSpecName: "kube-api-access-wq2b8") pod 
"2785c135-405d-460b-8201-552b59689f1b" (UID: "2785c135-405d-460b-8201-552b59689f1b"). InnerVolumeSpecName "kube-api-access-wq2b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.118742 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2785c135-405d-460b-8201-552b59689f1b-utilities" (OuterVolumeSpecName: "utilities") pod "2785c135-405d-460b-8201-552b59689f1b" (UID: "2785c135-405d-460b-8201-552b59689f1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.175887 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2785c135-405d-460b-8201-552b59689f1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2785c135-405d-460b-8201-552b59689f1b" (UID: "2785c135-405d-460b-8201-552b59689f1b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.213612 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2785c135-405d-460b-8201-552b59689f1b-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.213646 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq2b8\" (UniqueName: \"kubernetes.io/projected/2785c135-405d-460b-8201-552b59689f1b-kube-api-access-wq2b8\") on node \"crc\" DevicePath \"\"" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.213657 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2785c135-405d-460b-8201-552b59689f1b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.440511 4749 generic.go:334] "Generic (PLEG): container finished" podID="2785c135-405d-460b-8201-552b59689f1b" containerID="b4a8bb583fe65030b88834c1e2827cee638c95e727fe45d041fef93764b97643" exitCode=0 Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.440576 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhfp8" event={"ID":"2785c135-405d-460b-8201-552b59689f1b","Type":"ContainerDied","Data":"b4a8bb583fe65030b88834c1e2827cee638c95e727fe45d041fef93764b97643"} Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.440601 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhfp8" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.440626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhfp8" event={"ID":"2785c135-405d-460b-8201-552b59689f1b","Type":"ContainerDied","Data":"2d048729d895b2d2b93cb71594eaeee9898c99f733d7da59b3070fc67f567d1f"} Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.440652 4749 scope.go:117] "RemoveContainer" containerID="b4a8bb583fe65030b88834c1e2827cee638c95e727fe45d041fef93764b97643" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.466288 4749 scope.go:117] "RemoveContainer" containerID="01ffc1dac2df5f35c6520de01e8cb2e22e6b3ff9493f8c517a14c0835e3796b1" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.495926 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fhfp8"] Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.506548 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fhfp8"] Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.522243 4749 scope.go:117] "RemoveContainer" containerID="4b0dfc58c7d9526241e533f8969c147296fc50b1c66ba2bf02e1ce36c317fb8c" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.543112 4749 scope.go:117] "RemoveContainer" containerID="b4a8bb583fe65030b88834c1e2827cee638c95e727fe45d041fef93764b97643" Mar 10 17:22:25 crc kubenswrapper[4749]: E0310 17:22:25.543670 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4a8bb583fe65030b88834c1e2827cee638c95e727fe45d041fef93764b97643\": container with ID starting with b4a8bb583fe65030b88834c1e2827cee638c95e727fe45d041fef93764b97643 not found: ID does not exist" containerID="b4a8bb583fe65030b88834c1e2827cee638c95e727fe45d041fef93764b97643" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.543892 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4a8bb583fe65030b88834c1e2827cee638c95e727fe45d041fef93764b97643"} err="failed to get container status \"b4a8bb583fe65030b88834c1e2827cee638c95e727fe45d041fef93764b97643\": rpc error: code = NotFound desc = could not find container \"b4a8bb583fe65030b88834c1e2827cee638c95e727fe45d041fef93764b97643\": container with ID starting with b4a8bb583fe65030b88834c1e2827cee638c95e727fe45d041fef93764b97643 not found: ID does not exist" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.543922 4749 scope.go:117] "RemoveContainer" containerID="01ffc1dac2df5f35c6520de01e8cb2e22e6b3ff9493f8c517a14c0835e3796b1" Mar 10 17:22:25 crc kubenswrapper[4749]: E0310 17:22:25.544180 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01ffc1dac2df5f35c6520de01e8cb2e22e6b3ff9493f8c517a14c0835e3796b1\": container with ID starting with 01ffc1dac2df5f35c6520de01e8cb2e22e6b3ff9493f8c517a14c0835e3796b1 not found: ID does not exist" containerID="01ffc1dac2df5f35c6520de01e8cb2e22e6b3ff9493f8c517a14c0835e3796b1" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.544227 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ffc1dac2df5f35c6520de01e8cb2e22e6b3ff9493f8c517a14c0835e3796b1"} err="failed to get container status \"01ffc1dac2df5f35c6520de01e8cb2e22e6b3ff9493f8c517a14c0835e3796b1\": rpc error: code = NotFound desc = could not find container \"01ffc1dac2df5f35c6520de01e8cb2e22e6b3ff9493f8c517a14c0835e3796b1\": container with ID starting with 01ffc1dac2df5f35c6520de01e8cb2e22e6b3ff9493f8c517a14c0835e3796b1 not found: ID does not exist" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.544253 4749 scope.go:117] "RemoveContainer" containerID="4b0dfc58c7d9526241e533f8969c147296fc50b1c66ba2bf02e1ce36c317fb8c" Mar 10 17:22:25 crc kubenswrapper[4749]: E0310 
17:22:25.544527 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b0dfc58c7d9526241e533f8969c147296fc50b1c66ba2bf02e1ce36c317fb8c\": container with ID starting with 4b0dfc58c7d9526241e533f8969c147296fc50b1c66ba2bf02e1ce36c317fb8c not found: ID does not exist" containerID="4b0dfc58c7d9526241e533f8969c147296fc50b1c66ba2bf02e1ce36c317fb8c" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.544561 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b0dfc58c7d9526241e533f8969c147296fc50b1c66ba2bf02e1ce36c317fb8c"} err="failed to get container status \"4b0dfc58c7d9526241e533f8969c147296fc50b1c66ba2bf02e1ce36c317fb8c\": rpc error: code = NotFound desc = could not find container \"4b0dfc58c7d9526241e533f8969c147296fc50b1c66ba2bf02e1ce36c317fb8c\": container with ID starting with 4b0dfc58c7d9526241e533f8969c147296fc50b1c66ba2bf02e1ce36c317fb8c not found: ID does not exist" Mar 10 17:22:25 crc kubenswrapper[4749]: I0310 17:22:25.619329 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2785c135-405d-460b-8201-552b59689f1b" path="/var/lib/kubelet/pods/2785c135-405d-460b-8201-552b59689f1b/volumes" Mar 10 17:22:27 crc kubenswrapper[4749]: I0310 17:22:27.607135 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:22:27 crc kubenswrapper[4749]: E0310 17:22:27.607659 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:22:27 crc kubenswrapper[4749]: I0310 17:22:27.690537 
4749 scope.go:117] "RemoveContainer" containerID="b357b1fbcc0cb613556e38f737fa7dab83e3e108a6f7630b0ef078d622e82ef6" Mar 10 17:22:27 crc kubenswrapper[4749]: I0310 17:22:27.734416 4749 scope.go:117] "RemoveContainer" containerID="a627c7d51a6873e42fa60a483d9edf9b88fbd5393208fe23619119016e253dc0" Mar 10 17:22:38 crc kubenswrapper[4749]: I0310 17:22:38.607096 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:22:38 crc kubenswrapper[4749]: E0310 17:22:38.608017 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:22:50 crc kubenswrapper[4749]: I0310 17:22:50.607134 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:22:50 crc kubenswrapper[4749]: E0310 17:22:50.607909 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:23:01 crc kubenswrapper[4749]: I0310 17:23:01.606566 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:23:02 crc kubenswrapper[4749]: I0310 17:23:02.760282 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"2d05271c71f4359f91a631f79b2084ee3a82bd748331985a2adec121e2772060"} Mar 10 17:23:27 crc kubenswrapper[4749]: I0310 17:23:27.809518 4749 scope.go:117] "RemoveContainer" containerID="b5051e943235eadb455ac5ff670027f6ebb8544451fa21ee2d9b2a03435e1bde" Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 17:23:54.614151 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nxkpk"] Mar 10 17:23:54 crc kubenswrapper[4749]: E0310 17:23:54.615057 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2785c135-405d-460b-8201-552b59689f1b" containerName="extract-utilities" Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 17:23:54.615073 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2785c135-405d-460b-8201-552b59689f1b" containerName="extract-utilities" Mar 10 17:23:54 crc kubenswrapper[4749]: E0310 17:23:54.615110 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2785c135-405d-460b-8201-552b59689f1b" containerName="extract-content" Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 17:23:54.615119 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2785c135-405d-460b-8201-552b59689f1b" containerName="extract-content" Mar 10 17:23:54 crc kubenswrapper[4749]: E0310 17:23:54.615134 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2785c135-405d-460b-8201-552b59689f1b" containerName="registry-server" Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 17:23:54.615142 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2785c135-405d-460b-8201-552b59689f1b" containerName="registry-server" Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 17:23:54.615369 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2785c135-405d-460b-8201-552b59689f1b" containerName="registry-server" Mar 10 17:23:54 crc 
kubenswrapper[4749]: I0310 17:23:54.616856 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 17:23:54.624642 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nxkpk"] Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 17:23:54.722605 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583e155b-eac1-4cd9-8fbb-c35b81f13203-utilities\") pod \"certified-operators-nxkpk\" (UID: \"583e155b-eac1-4cd9-8fbb-c35b81f13203\") " pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 17:23:54.722938 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583e155b-eac1-4cd9-8fbb-c35b81f13203-catalog-content\") pod \"certified-operators-nxkpk\" (UID: \"583e155b-eac1-4cd9-8fbb-c35b81f13203\") " pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 17:23:54.723058 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4wmm\" (UniqueName: \"kubernetes.io/projected/583e155b-eac1-4cd9-8fbb-c35b81f13203-kube-api-access-h4wmm\") pod \"certified-operators-nxkpk\" (UID: \"583e155b-eac1-4cd9-8fbb-c35b81f13203\") " pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 17:23:54.824598 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583e155b-eac1-4cd9-8fbb-c35b81f13203-utilities\") pod \"certified-operators-nxkpk\" (UID: \"583e155b-eac1-4cd9-8fbb-c35b81f13203\") " pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:23:54 crc 
kubenswrapper[4749]: I0310 17:23:54.824662 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583e155b-eac1-4cd9-8fbb-c35b81f13203-catalog-content\") pod \"certified-operators-nxkpk\" (UID: \"583e155b-eac1-4cd9-8fbb-c35b81f13203\") " pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 17:23:54.824691 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4wmm\" (UniqueName: \"kubernetes.io/projected/583e155b-eac1-4cd9-8fbb-c35b81f13203-kube-api-access-h4wmm\") pod \"certified-operators-nxkpk\" (UID: \"583e155b-eac1-4cd9-8fbb-c35b81f13203\") " pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 17:23:54.825190 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583e155b-eac1-4cd9-8fbb-c35b81f13203-utilities\") pod \"certified-operators-nxkpk\" (UID: \"583e155b-eac1-4cd9-8fbb-c35b81f13203\") " pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 17:23:54.825241 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583e155b-eac1-4cd9-8fbb-c35b81f13203-catalog-content\") pod \"certified-operators-nxkpk\" (UID: \"583e155b-eac1-4cd9-8fbb-c35b81f13203\") " pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 17:23:54.853106 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4wmm\" (UniqueName: \"kubernetes.io/projected/583e155b-eac1-4cd9-8fbb-c35b81f13203-kube-api-access-h4wmm\") pod \"certified-operators-nxkpk\" (UID: \"583e155b-eac1-4cd9-8fbb-c35b81f13203\") " pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:23:54 crc kubenswrapper[4749]: I0310 
17:23:54.948276 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:23:55 crc kubenswrapper[4749]: I0310 17:23:55.415084 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nxkpk"] Mar 10 17:23:56 crc kubenswrapper[4749]: I0310 17:23:56.203523 4749 generic.go:334] "Generic (PLEG): container finished" podID="583e155b-eac1-4cd9-8fbb-c35b81f13203" containerID="8aeca0bafef8378c40b17b89a3533a0b19c0e8fcfa89937255911e24da95dc5c" exitCode=0 Mar 10 17:23:56 crc kubenswrapper[4749]: I0310 17:23:56.203606 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxkpk" event={"ID":"583e155b-eac1-4cd9-8fbb-c35b81f13203","Type":"ContainerDied","Data":"8aeca0bafef8378c40b17b89a3533a0b19c0e8fcfa89937255911e24da95dc5c"} Mar 10 17:23:56 crc kubenswrapper[4749]: I0310 17:23:56.203814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxkpk" event={"ID":"583e155b-eac1-4cd9-8fbb-c35b81f13203","Type":"ContainerStarted","Data":"f8801ed46b9542e5fd5688694455bb10722c9c03d923e363322c4cc0c15298a1"} Mar 10 17:23:56 crc kubenswrapper[4749]: I0310 17:23:56.205808 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 17:23:57 crc kubenswrapper[4749]: I0310 17:23:57.212731 4749 generic.go:334] "Generic (PLEG): container finished" podID="583e155b-eac1-4cd9-8fbb-c35b81f13203" containerID="02bb106e21ff7013c894b0f8fe5edf39d453f1575fa5a17a533d84ba3aa6485e" exitCode=0 Mar 10 17:23:57 crc kubenswrapper[4749]: I0310 17:23:57.212882 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxkpk" event={"ID":"583e155b-eac1-4cd9-8fbb-c35b81f13203","Type":"ContainerDied","Data":"02bb106e21ff7013c894b0f8fe5edf39d453f1575fa5a17a533d84ba3aa6485e"} Mar 10 17:23:58 crc 
kubenswrapper[4749]: I0310 17:23:58.223209 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxkpk" event={"ID":"583e155b-eac1-4cd9-8fbb-c35b81f13203","Type":"ContainerStarted","Data":"3ee06dc869aabecff32137585b1d05b95551f089bab083f40d20ad5dffdb6396"} Mar 10 17:23:58 crc kubenswrapper[4749]: I0310 17:23:58.248200 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nxkpk" podStartSLOduration=2.790661279 podStartE2EDuration="4.248182002s" podCreationTimestamp="2026-03-10 17:23:54 +0000 UTC" firstStartedPulling="2026-03-10 17:23:56.205553632 +0000 UTC m=+5733.327419329" lastFinishedPulling="2026-03-10 17:23:57.663074355 +0000 UTC m=+5734.784940052" observedRunningTime="2026-03-10 17:23:58.243493504 +0000 UTC m=+5735.365359211" watchObservedRunningTime="2026-03-10 17:23:58.248182002 +0000 UTC m=+5735.370047689" Mar 10 17:24:00 crc kubenswrapper[4749]: I0310 17:24:00.152590 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552724-8wvpn"] Mar 10 17:24:00 crc kubenswrapper[4749]: I0310 17:24:00.155640 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552724-8wvpn" Mar 10 17:24:00 crc kubenswrapper[4749]: I0310 17:24:00.157954 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:24:00 crc kubenswrapper[4749]: I0310 17:24:00.158364 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:24:00 crc kubenswrapper[4749]: I0310 17:24:00.160274 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:24:00 crc kubenswrapper[4749]: I0310 17:24:00.168282 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552724-8wvpn"] Mar 10 17:24:00 crc kubenswrapper[4749]: I0310 17:24:00.222579 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2b98\" (UniqueName: \"kubernetes.io/projected/989a1514-fe33-4892-9ad4-d9d471fe007c-kube-api-access-m2b98\") pod \"auto-csr-approver-29552724-8wvpn\" (UID: \"989a1514-fe33-4892-9ad4-d9d471fe007c\") " pod="openshift-infra/auto-csr-approver-29552724-8wvpn" Mar 10 17:24:00 crc kubenswrapper[4749]: I0310 17:24:00.326007 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2b98\" (UniqueName: \"kubernetes.io/projected/989a1514-fe33-4892-9ad4-d9d471fe007c-kube-api-access-m2b98\") pod \"auto-csr-approver-29552724-8wvpn\" (UID: \"989a1514-fe33-4892-9ad4-d9d471fe007c\") " pod="openshift-infra/auto-csr-approver-29552724-8wvpn" Mar 10 17:24:00 crc kubenswrapper[4749]: I0310 17:24:00.349698 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2b98\" (UniqueName: \"kubernetes.io/projected/989a1514-fe33-4892-9ad4-d9d471fe007c-kube-api-access-m2b98\") pod \"auto-csr-approver-29552724-8wvpn\" (UID: \"989a1514-fe33-4892-9ad4-d9d471fe007c\") " 
pod="openshift-infra/auto-csr-approver-29552724-8wvpn" Mar 10 17:24:00 crc kubenswrapper[4749]: I0310 17:24:00.484036 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552724-8wvpn" Mar 10 17:24:00 crc kubenswrapper[4749]: I0310 17:24:00.938966 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552724-8wvpn"] Mar 10 17:24:00 crc kubenswrapper[4749]: W0310 17:24:00.950405 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod989a1514_fe33_4892_9ad4_d9d471fe007c.slice/crio-c4159e80157647dba229b76f2e3f313a2ba6f6ba6869149eaf73e6f76de5acf3 WatchSource:0}: Error finding container c4159e80157647dba229b76f2e3f313a2ba6f6ba6869149eaf73e6f76de5acf3: Status 404 returned error can't find the container with id c4159e80157647dba229b76f2e3f313a2ba6f6ba6869149eaf73e6f76de5acf3 Mar 10 17:24:01 crc kubenswrapper[4749]: I0310 17:24:01.254215 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552724-8wvpn" event={"ID":"989a1514-fe33-4892-9ad4-d9d471fe007c","Type":"ContainerStarted","Data":"c4159e80157647dba229b76f2e3f313a2ba6f6ba6869149eaf73e6f76de5acf3"} Mar 10 17:24:03 crc kubenswrapper[4749]: I0310 17:24:03.277290 4749 generic.go:334] "Generic (PLEG): container finished" podID="989a1514-fe33-4892-9ad4-d9d471fe007c" containerID="72c26045a87be0ede557dd4d3f87f986512799cc8c615cc6799f1a6fc6fd985e" exitCode=0 Mar 10 17:24:03 crc kubenswrapper[4749]: I0310 17:24:03.277406 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552724-8wvpn" event={"ID":"989a1514-fe33-4892-9ad4-d9d471fe007c","Type":"ContainerDied","Data":"72c26045a87be0ede557dd4d3f87f986512799cc8c615cc6799f1a6fc6fd985e"} Mar 10 17:24:04 crc kubenswrapper[4749]: I0310 17:24:04.604164 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552724-8wvpn" Mar 10 17:24:04 crc kubenswrapper[4749]: I0310 17:24:04.712340 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2b98\" (UniqueName: \"kubernetes.io/projected/989a1514-fe33-4892-9ad4-d9d471fe007c-kube-api-access-m2b98\") pod \"989a1514-fe33-4892-9ad4-d9d471fe007c\" (UID: \"989a1514-fe33-4892-9ad4-d9d471fe007c\") " Mar 10 17:24:04 crc kubenswrapper[4749]: I0310 17:24:04.718363 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/989a1514-fe33-4892-9ad4-d9d471fe007c-kube-api-access-m2b98" (OuterVolumeSpecName: "kube-api-access-m2b98") pod "989a1514-fe33-4892-9ad4-d9d471fe007c" (UID: "989a1514-fe33-4892-9ad4-d9d471fe007c"). InnerVolumeSpecName "kube-api-access-m2b98". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:24:04 crc kubenswrapper[4749]: I0310 17:24:04.814730 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2b98\" (UniqueName: \"kubernetes.io/projected/989a1514-fe33-4892-9ad4-d9d471fe007c-kube-api-access-m2b98\") on node \"crc\" DevicePath \"\"" Mar 10 17:24:04 crc kubenswrapper[4749]: I0310 17:24:04.949258 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:24:04 crc kubenswrapper[4749]: I0310 17:24:04.949332 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:24:05 crc kubenswrapper[4749]: I0310 17:24:05.020398 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:24:05 crc kubenswrapper[4749]: I0310 17:24:05.295582 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552724-8wvpn" Mar 10 17:24:05 crc kubenswrapper[4749]: I0310 17:24:05.295656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552724-8wvpn" event={"ID":"989a1514-fe33-4892-9ad4-d9d471fe007c","Type":"ContainerDied","Data":"c4159e80157647dba229b76f2e3f313a2ba6f6ba6869149eaf73e6f76de5acf3"} Mar 10 17:24:05 crc kubenswrapper[4749]: I0310 17:24:05.295694 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4159e80157647dba229b76f2e3f313a2ba6f6ba6869149eaf73e6f76de5acf3" Mar 10 17:24:05 crc kubenswrapper[4749]: I0310 17:24:05.348105 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:24:05 crc kubenswrapper[4749]: I0310 17:24:05.393169 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nxkpk"] Mar 10 17:24:05 crc kubenswrapper[4749]: I0310 17:24:05.686247 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552718-klb5c"] Mar 10 17:24:05 crc kubenswrapper[4749]: I0310 17:24:05.701763 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552718-klb5c"] Mar 10 17:24:07 crc kubenswrapper[4749]: I0310 17:24:07.311159 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nxkpk" podUID="583e155b-eac1-4cd9-8fbb-c35b81f13203" containerName="registry-server" containerID="cri-o://3ee06dc869aabecff32137585b1d05b95551f089bab083f40d20ad5dffdb6396" gracePeriod=2 Mar 10 17:24:07 crc kubenswrapper[4749]: I0310 17:24:07.617281 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a746a1c-dd59-4d02-a198-a9d8b239947d" path="/var/lib/kubelet/pods/6a746a1c-dd59-4d02-a198-a9d8b239947d/volumes" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 
17:24:08.210502 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.323138 4749 generic.go:334] "Generic (PLEG): container finished" podID="583e155b-eac1-4cd9-8fbb-c35b81f13203" containerID="3ee06dc869aabecff32137585b1d05b95551f089bab083f40d20ad5dffdb6396" exitCode=0 Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.323205 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxkpk" event={"ID":"583e155b-eac1-4cd9-8fbb-c35b81f13203","Type":"ContainerDied","Data":"3ee06dc869aabecff32137585b1d05b95551f089bab083f40d20ad5dffdb6396"} Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.323249 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxkpk" event={"ID":"583e155b-eac1-4cd9-8fbb-c35b81f13203","Type":"ContainerDied","Data":"f8801ed46b9542e5fd5688694455bb10722c9c03d923e363322c4cc0c15298a1"} Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.323276 4749 scope.go:117] "RemoveContainer" containerID="3ee06dc869aabecff32137585b1d05b95551f089bab083f40d20ad5dffdb6396" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.323505 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nxkpk" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.349564 4749 scope.go:117] "RemoveContainer" containerID="02bb106e21ff7013c894b0f8fe5edf39d453f1575fa5a17a533d84ba3aa6485e" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.378568 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583e155b-eac1-4cd9-8fbb-c35b81f13203-catalog-content\") pod \"583e155b-eac1-4cd9-8fbb-c35b81f13203\" (UID: \"583e155b-eac1-4cd9-8fbb-c35b81f13203\") " Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.378646 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4wmm\" (UniqueName: \"kubernetes.io/projected/583e155b-eac1-4cd9-8fbb-c35b81f13203-kube-api-access-h4wmm\") pod \"583e155b-eac1-4cd9-8fbb-c35b81f13203\" (UID: \"583e155b-eac1-4cd9-8fbb-c35b81f13203\") " Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.378856 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583e155b-eac1-4cd9-8fbb-c35b81f13203-utilities\") pod \"583e155b-eac1-4cd9-8fbb-c35b81f13203\" (UID: \"583e155b-eac1-4cd9-8fbb-c35b81f13203\") " Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.379641 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/583e155b-eac1-4cd9-8fbb-c35b81f13203-utilities" (OuterVolumeSpecName: "utilities") pod "583e155b-eac1-4cd9-8fbb-c35b81f13203" (UID: "583e155b-eac1-4cd9-8fbb-c35b81f13203"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.380833 4749 scope.go:117] "RemoveContainer" containerID="8aeca0bafef8378c40b17b89a3533a0b19c0e8fcfa89937255911e24da95dc5c" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.385313 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/583e155b-eac1-4cd9-8fbb-c35b81f13203-kube-api-access-h4wmm" (OuterVolumeSpecName: "kube-api-access-h4wmm") pod "583e155b-eac1-4cd9-8fbb-c35b81f13203" (UID: "583e155b-eac1-4cd9-8fbb-c35b81f13203"). InnerVolumeSpecName "kube-api-access-h4wmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.463607 4749 scope.go:117] "RemoveContainer" containerID="3ee06dc869aabecff32137585b1d05b95551f089bab083f40d20ad5dffdb6396" Mar 10 17:24:08 crc kubenswrapper[4749]: E0310 17:24:08.464917 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee06dc869aabecff32137585b1d05b95551f089bab083f40d20ad5dffdb6396\": container with ID starting with 3ee06dc869aabecff32137585b1d05b95551f089bab083f40d20ad5dffdb6396 not found: ID does not exist" containerID="3ee06dc869aabecff32137585b1d05b95551f089bab083f40d20ad5dffdb6396" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.464987 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee06dc869aabecff32137585b1d05b95551f089bab083f40d20ad5dffdb6396"} err="failed to get container status \"3ee06dc869aabecff32137585b1d05b95551f089bab083f40d20ad5dffdb6396\": rpc error: code = NotFound desc = could not find container \"3ee06dc869aabecff32137585b1d05b95551f089bab083f40d20ad5dffdb6396\": container with ID starting with 3ee06dc869aabecff32137585b1d05b95551f089bab083f40d20ad5dffdb6396 not found: ID does not exist" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.465026 
4749 scope.go:117] "RemoveContainer" containerID="02bb106e21ff7013c894b0f8fe5edf39d453f1575fa5a17a533d84ba3aa6485e" Mar 10 17:24:08 crc kubenswrapper[4749]: E0310 17:24:08.465733 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02bb106e21ff7013c894b0f8fe5edf39d453f1575fa5a17a533d84ba3aa6485e\": container with ID starting with 02bb106e21ff7013c894b0f8fe5edf39d453f1575fa5a17a533d84ba3aa6485e not found: ID does not exist" containerID="02bb106e21ff7013c894b0f8fe5edf39d453f1575fa5a17a533d84ba3aa6485e" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.465860 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02bb106e21ff7013c894b0f8fe5edf39d453f1575fa5a17a533d84ba3aa6485e"} err="failed to get container status \"02bb106e21ff7013c894b0f8fe5edf39d453f1575fa5a17a533d84ba3aa6485e\": rpc error: code = NotFound desc = could not find container \"02bb106e21ff7013c894b0f8fe5edf39d453f1575fa5a17a533d84ba3aa6485e\": container with ID starting with 02bb106e21ff7013c894b0f8fe5edf39d453f1575fa5a17a533d84ba3aa6485e not found: ID does not exist" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.465947 4749 scope.go:117] "RemoveContainer" containerID="8aeca0bafef8378c40b17b89a3533a0b19c0e8fcfa89937255911e24da95dc5c" Mar 10 17:24:08 crc kubenswrapper[4749]: E0310 17:24:08.466601 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aeca0bafef8378c40b17b89a3533a0b19c0e8fcfa89937255911e24da95dc5c\": container with ID starting with 8aeca0bafef8378c40b17b89a3533a0b19c0e8fcfa89937255911e24da95dc5c not found: ID does not exist" containerID="8aeca0bafef8378c40b17b89a3533a0b19c0e8fcfa89937255911e24da95dc5c" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.466667 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8aeca0bafef8378c40b17b89a3533a0b19c0e8fcfa89937255911e24da95dc5c"} err="failed to get container status \"8aeca0bafef8378c40b17b89a3533a0b19c0e8fcfa89937255911e24da95dc5c\": rpc error: code = NotFound desc = could not find container \"8aeca0bafef8378c40b17b89a3533a0b19c0e8fcfa89937255911e24da95dc5c\": container with ID starting with 8aeca0bafef8378c40b17b89a3533a0b19c0e8fcfa89937255911e24da95dc5c not found: ID does not exist" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.481186 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4wmm\" (UniqueName: \"kubernetes.io/projected/583e155b-eac1-4cd9-8fbb-c35b81f13203-kube-api-access-h4wmm\") on node \"crc\" DevicePath \"\"" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.481214 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583e155b-eac1-4cd9-8fbb-c35b81f13203-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.876229 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/583e155b-eac1-4cd9-8fbb-c35b81f13203-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "583e155b-eac1-4cd9-8fbb-c35b81f13203" (UID: "583e155b-eac1-4cd9-8fbb-c35b81f13203"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.890512 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583e155b-eac1-4cd9-8fbb-c35b81f13203-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.985784 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nxkpk"] Mar 10 17:24:08 crc kubenswrapper[4749]: I0310 17:24:08.993776 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nxkpk"] Mar 10 17:24:09 crc kubenswrapper[4749]: E0310 17:24:09.087918 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod583e155b_eac1_4cd9_8fbb_c35b81f13203.slice\": RecentStats: unable to find data in memory cache]" Mar 10 17:24:09 crc kubenswrapper[4749]: I0310 17:24:09.616565 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="583e155b-eac1-4cd9-8fbb-c35b81f13203" path="/var/lib/kubelet/pods/583e155b-eac1-4cd9-8fbb-c35b81f13203/volumes" Mar 10 17:24:27 crc kubenswrapper[4749]: I0310 17:24:27.871094 4749 scope.go:117] "RemoveContainer" containerID="76764a01ec7406824069e546672633adbc7cfe5d6ad7479b0e0cfa2945251261" Mar 10 17:24:27 crc kubenswrapper[4749]: I0310 17:24:27.911278 4749 scope.go:117] "RemoveContainer" containerID="75757afc171bcafeae6bf124f6d16942d183e49c261d76d64096dd63f8262253" Mar 10 17:25:20 crc kubenswrapper[4749]: I0310 17:25:20.980758 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:25:20 crc 
kubenswrapper[4749]: I0310 17:25:20.981253 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:25:27 crc kubenswrapper[4749]: I0310 17:25:27.991156 4749 scope.go:117] "RemoveContainer" containerID="0098d06c430536f716c2cdfcc19d3454b02b2b9cd3054fdd369bd6c9085c27ea" Mar 10 17:25:50 crc kubenswrapper[4749]: I0310 17:25:50.980170 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:25:50 crc kubenswrapper[4749]: I0310 17:25:50.980787 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.148827 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552726-fgcrs"] Mar 10 17:26:00 crc kubenswrapper[4749]: E0310 17:26:00.149623 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583e155b-eac1-4cd9-8fbb-c35b81f13203" containerName="extract-content" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.149645 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="583e155b-eac1-4cd9-8fbb-c35b81f13203" containerName="extract-content" Mar 10 17:26:00 crc kubenswrapper[4749]: E0310 17:26:00.149679 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="989a1514-fe33-4892-9ad4-d9d471fe007c" containerName="oc" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.149690 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="989a1514-fe33-4892-9ad4-d9d471fe007c" containerName="oc" Mar 10 17:26:00 crc kubenswrapper[4749]: E0310 17:26:00.149721 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583e155b-eac1-4cd9-8fbb-c35b81f13203" containerName="registry-server" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.149730 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="583e155b-eac1-4cd9-8fbb-c35b81f13203" containerName="registry-server" Mar 10 17:26:00 crc kubenswrapper[4749]: E0310 17:26:00.149740 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583e155b-eac1-4cd9-8fbb-c35b81f13203" containerName="extract-utilities" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.149749 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="583e155b-eac1-4cd9-8fbb-c35b81f13203" containerName="extract-utilities" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.149986 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="989a1514-fe33-4892-9ad4-d9d471fe007c" containerName="oc" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.150008 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="583e155b-eac1-4cd9-8fbb-c35b81f13203" containerName="registry-server" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.151032 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552726-fgcrs" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.154276 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.154334 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.154746 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.156018 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552726-fgcrs"] Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.289317 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n7rl\" (UniqueName: \"kubernetes.io/projected/a7c63e28-dfb1-47d3-9d15-9d7cda0161c8-kube-api-access-9n7rl\") pod \"auto-csr-approver-29552726-fgcrs\" (UID: \"a7c63e28-dfb1-47d3-9d15-9d7cda0161c8\") " pod="openshift-infra/auto-csr-approver-29552726-fgcrs" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.391052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n7rl\" (UniqueName: \"kubernetes.io/projected/a7c63e28-dfb1-47d3-9d15-9d7cda0161c8-kube-api-access-9n7rl\") pod \"auto-csr-approver-29552726-fgcrs\" (UID: \"a7c63e28-dfb1-47d3-9d15-9d7cda0161c8\") " pod="openshift-infra/auto-csr-approver-29552726-fgcrs" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.424107 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n7rl\" (UniqueName: \"kubernetes.io/projected/a7c63e28-dfb1-47d3-9d15-9d7cda0161c8-kube-api-access-9n7rl\") pod \"auto-csr-approver-29552726-fgcrs\" (UID: \"a7c63e28-dfb1-47d3-9d15-9d7cda0161c8\") " 
pod="openshift-infra/auto-csr-approver-29552726-fgcrs" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.486856 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552726-fgcrs" Mar 10 17:26:00 crc kubenswrapper[4749]: I0310 17:26:00.943271 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552726-fgcrs"] Mar 10 17:26:00 crc kubenswrapper[4749]: W0310 17:26:00.946485 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7c63e28_dfb1_47d3_9d15_9d7cda0161c8.slice/crio-cf13f78f49de2b6e6115d71ed5cb82e6fee348c97d97cdf92ecd1d01ad48d306 WatchSource:0}: Error finding container cf13f78f49de2b6e6115d71ed5cb82e6fee348c97d97cdf92ecd1d01ad48d306: Status 404 returned error can't find the container with id cf13f78f49de2b6e6115d71ed5cb82e6fee348c97d97cdf92ecd1d01ad48d306 Mar 10 17:26:01 crc kubenswrapper[4749]: I0310 17:26:01.285054 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552726-fgcrs" event={"ID":"a7c63e28-dfb1-47d3-9d15-9d7cda0161c8","Type":"ContainerStarted","Data":"cf13f78f49de2b6e6115d71ed5cb82e6fee348c97d97cdf92ecd1d01ad48d306"} Mar 10 17:26:04 crc kubenswrapper[4749]: I0310 17:26:04.314443 4749 generic.go:334] "Generic (PLEG): container finished" podID="a7c63e28-dfb1-47d3-9d15-9d7cda0161c8" containerID="21542492f0bcb0f1a60cedb66c2497821a0b2d45ca1af38ad2320a8f28ff908e" exitCode=0 Mar 10 17:26:04 crc kubenswrapper[4749]: I0310 17:26:04.314720 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552726-fgcrs" event={"ID":"a7c63e28-dfb1-47d3-9d15-9d7cda0161c8","Type":"ContainerDied","Data":"21542492f0bcb0f1a60cedb66c2497821a0b2d45ca1af38ad2320a8f28ff908e"} Mar 10 17:26:05 crc kubenswrapper[4749]: I0310 17:26:05.667392 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552726-fgcrs" Mar 10 17:26:05 crc kubenswrapper[4749]: I0310 17:26:05.724613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n7rl\" (UniqueName: \"kubernetes.io/projected/a7c63e28-dfb1-47d3-9d15-9d7cda0161c8-kube-api-access-9n7rl\") pod \"a7c63e28-dfb1-47d3-9d15-9d7cda0161c8\" (UID: \"a7c63e28-dfb1-47d3-9d15-9d7cda0161c8\") " Mar 10 17:26:05 crc kubenswrapper[4749]: I0310 17:26:05.733606 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c63e28-dfb1-47d3-9d15-9d7cda0161c8-kube-api-access-9n7rl" (OuterVolumeSpecName: "kube-api-access-9n7rl") pod "a7c63e28-dfb1-47d3-9d15-9d7cda0161c8" (UID: "a7c63e28-dfb1-47d3-9d15-9d7cda0161c8"). InnerVolumeSpecName "kube-api-access-9n7rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:26:05 crc kubenswrapper[4749]: I0310 17:26:05.826623 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n7rl\" (UniqueName: \"kubernetes.io/projected/a7c63e28-dfb1-47d3-9d15-9d7cda0161c8-kube-api-access-9n7rl\") on node \"crc\" DevicePath \"\"" Mar 10 17:26:06 crc kubenswrapper[4749]: I0310 17:26:06.341716 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552726-fgcrs" event={"ID":"a7c63e28-dfb1-47d3-9d15-9d7cda0161c8","Type":"ContainerDied","Data":"cf13f78f49de2b6e6115d71ed5cb82e6fee348c97d97cdf92ecd1d01ad48d306"} Mar 10 17:26:06 crc kubenswrapper[4749]: I0310 17:26:06.341777 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf13f78f49de2b6e6115d71ed5cb82e6fee348c97d97cdf92ecd1d01ad48d306" Mar 10 17:26:06 crc kubenswrapper[4749]: I0310 17:26:06.341806 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552726-fgcrs" Mar 10 17:26:06 crc kubenswrapper[4749]: I0310 17:26:06.745712 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552720-mzvxb"] Mar 10 17:26:06 crc kubenswrapper[4749]: I0310 17:26:06.753107 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552720-mzvxb"] Mar 10 17:26:07 crc kubenswrapper[4749]: I0310 17:26:07.622092 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a558e344-f2c6-471b-94ea-7ace7dd44b18" path="/var/lib/kubelet/pods/a558e344-f2c6-471b-94ea-7ace7dd44b18/volumes" Mar 10 17:26:20 crc kubenswrapper[4749]: I0310 17:26:20.980626 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:26:20 crc kubenswrapper[4749]: I0310 17:26:20.981172 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:26:20 crc kubenswrapper[4749]: I0310 17:26:20.981216 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 17:26:20 crc kubenswrapper[4749]: I0310 17:26:20.981917 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d05271c71f4359f91a631f79b2084ee3a82bd748331985a2adec121e2772060"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 17:26:20 crc kubenswrapper[4749]: I0310 17:26:20.981972 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://2d05271c71f4359f91a631f79b2084ee3a82bd748331985a2adec121e2772060" gracePeriod=600 Mar 10 17:26:21 crc kubenswrapper[4749]: I0310 17:26:21.680608 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="2d05271c71f4359f91a631f79b2084ee3a82bd748331985a2adec121e2772060" exitCode=0 Mar 10 17:26:21 crc kubenswrapper[4749]: I0310 17:26:21.681156 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"2d05271c71f4359f91a631f79b2084ee3a82bd748331985a2adec121e2772060"} Mar 10 17:26:21 crc kubenswrapper[4749]: I0310 17:26:21.681207 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"} Mar 10 17:26:21 crc kubenswrapper[4749]: I0310 17:26:21.681227 4749 scope.go:117] "RemoveContainer" containerID="b6ee05939f3b86bdc9a4fb58a1c15c77dfc1121dc26917a673d995805dbd5de8" Mar 10 17:26:28 crc kubenswrapper[4749]: I0310 17:26:28.074935 4749 scope.go:117] "RemoveContainer" containerID="22e44cdbb944428ed4dc01a31007cabbdd7e402993d148f14678e3463f0ce70a" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.117142 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8bbpn"] Mar 10 17:27:33 crc kubenswrapper[4749]: E0310 
17:27:33.118199 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c63e28-dfb1-47d3-9d15-9d7cda0161c8" containerName="oc" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.118218 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c63e28-dfb1-47d3-9d15-9d7cda0161c8" containerName="oc" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.118435 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7c63e28-dfb1-47d3-9d15-9d7cda0161c8" containerName="oc" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.120550 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.133316 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8bbpn"] Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.284462 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc128726-366c-4926-97f1-bd582dbb2a71-catalog-content\") pod \"redhat-operators-8bbpn\" (UID: \"cc128726-366c-4926-97f1-bd582dbb2a71\") " pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.284792 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc128726-366c-4926-97f1-bd582dbb2a71-utilities\") pod \"redhat-operators-8bbpn\" (UID: \"cc128726-366c-4926-97f1-bd582dbb2a71\") " pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.284823 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7nhv\" (UniqueName: \"kubernetes.io/projected/cc128726-366c-4926-97f1-bd582dbb2a71-kube-api-access-b7nhv\") pod 
\"redhat-operators-8bbpn\" (UID: \"cc128726-366c-4926-97f1-bd582dbb2a71\") " pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.385723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc128726-366c-4926-97f1-bd582dbb2a71-utilities\") pod \"redhat-operators-8bbpn\" (UID: \"cc128726-366c-4926-97f1-bd582dbb2a71\") " pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.385781 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7nhv\" (UniqueName: \"kubernetes.io/projected/cc128726-366c-4926-97f1-bd582dbb2a71-kube-api-access-b7nhv\") pod \"redhat-operators-8bbpn\" (UID: \"cc128726-366c-4926-97f1-bd582dbb2a71\") " pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.385910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc128726-366c-4926-97f1-bd582dbb2a71-catalog-content\") pod \"redhat-operators-8bbpn\" (UID: \"cc128726-366c-4926-97f1-bd582dbb2a71\") " pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.386543 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc128726-366c-4926-97f1-bd582dbb2a71-catalog-content\") pod \"redhat-operators-8bbpn\" (UID: \"cc128726-366c-4926-97f1-bd582dbb2a71\") " pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.386547 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc128726-366c-4926-97f1-bd582dbb2a71-utilities\") pod \"redhat-operators-8bbpn\" (UID: 
\"cc128726-366c-4926-97f1-bd582dbb2a71\") " pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.410601 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7nhv\" (UniqueName: \"kubernetes.io/projected/cc128726-366c-4926-97f1-bd582dbb2a71-kube-api-access-b7nhv\") pod \"redhat-operators-8bbpn\" (UID: \"cc128726-366c-4926-97f1-bd582dbb2a71\") " pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.446002 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:33 crc kubenswrapper[4749]: I0310 17:27:33.898739 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8bbpn"] Mar 10 17:27:34 crc kubenswrapper[4749]: I0310 17:27:34.397242 4749 generic.go:334] "Generic (PLEG): container finished" podID="cc128726-366c-4926-97f1-bd582dbb2a71" containerID="abb21df2bb96708cb79790214a0b364c419343cbdc986e6aa0a72b1d26b13aa5" exitCode=0 Mar 10 17:27:34 crc kubenswrapper[4749]: I0310 17:27:34.397314 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bbpn" event={"ID":"cc128726-366c-4926-97f1-bd582dbb2a71","Type":"ContainerDied","Data":"abb21df2bb96708cb79790214a0b364c419343cbdc986e6aa0a72b1d26b13aa5"} Mar 10 17:27:34 crc kubenswrapper[4749]: I0310 17:27:34.397537 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bbpn" event={"ID":"cc128726-366c-4926-97f1-bd582dbb2a71","Type":"ContainerStarted","Data":"7fbde67eff35b49c22eff570b5171db1e3acf79ac47b448efa83d0651da4173c"} Mar 10 17:27:35 crc kubenswrapper[4749]: I0310 17:27:35.411165 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bbpn" 
event={"ID":"cc128726-366c-4926-97f1-bd582dbb2a71","Type":"ContainerStarted","Data":"40e575118c0f8dfe700cff95550c22d199192a5d6f12d1f1a245138b3114046f"} Mar 10 17:27:36 crc kubenswrapper[4749]: I0310 17:27:36.423301 4749 generic.go:334] "Generic (PLEG): container finished" podID="cc128726-366c-4926-97f1-bd582dbb2a71" containerID="40e575118c0f8dfe700cff95550c22d199192a5d6f12d1f1a245138b3114046f" exitCode=0 Mar 10 17:27:36 crc kubenswrapper[4749]: I0310 17:27:36.423404 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bbpn" event={"ID":"cc128726-366c-4926-97f1-bd582dbb2a71","Type":"ContainerDied","Data":"40e575118c0f8dfe700cff95550c22d199192a5d6f12d1f1a245138b3114046f"} Mar 10 17:27:37 crc kubenswrapper[4749]: I0310 17:27:37.433413 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bbpn" event={"ID":"cc128726-366c-4926-97f1-bd582dbb2a71","Type":"ContainerStarted","Data":"a955f4c646120fd80b9b6235cd58aed298e07e6eb4cc4a29876d50a411f70acd"} Mar 10 17:27:37 crc kubenswrapper[4749]: I0310 17:27:37.457850 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8bbpn" podStartSLOduration=2.003540442 podStartE2EDuration="4.457834961s" podCreationTimestamp="2026-03-10 17:27:33 +0000 UTC" firstStartedPulling="2026-03-10 17:27:34.399682805 +0000 UTC m=+5951.521548492" lastFinishedPulling="2026-03-10 17:27:36.853977314 +0000 UTC m=+5953.975843011" observedRunningTime="2026-03-10 17:27:37.451305124 +0000 UTC m=+5954.573170811" watchObservedRunningTime="2026-03-10 17:27:37.457834961 +0000 UTC m=+5954.579700648" Mar 10 17:27:43 crc kubenswrapper[4749]: I0310 17:27:43.446303 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:43 crc kubenswrapper[4749]: I0310 17:27:43.447238 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:44 crc kubenswrapper[4749]: I0310 17:27:44.490132 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8bbpn" podUID="cc128726-366c-4926-97f1-bd582dbb2a71" containerName="registry-server" probeResult="failure" output=< Mar 10 17:27:44 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Mar 10 17:27:44 crc kubenswrapper[4749]: > Mar 10 17:27:53 crc kubenswrapper[4749]: I0310 17:27:53.504643 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:53 crc kubenswrapper[4749]: I0310 17:27:53.560202 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:53 crc kubenswrapper[4749]: I0310 17:27:53.744221 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8bbpn"] Mar 10 17:27:54 crc kubenswrapper[4749]: I0310 17:27:54.603355 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8bbpn" podUID="cc128726-366c-4926-97f1-bd582dbb2a71" containerName="registry-server" containerID="cri-o://a955f4c646120fd80b9b6235cd58aed298e07e6eb4cc4a29876d50a411f70acd" gracePeriod=2 Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.114187 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.147817 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc128726-366c-4926-97f1-bd582dbb2a71-utilities\") pod \"cc128726-366c-4926-97f1-bd582dbb2a71\" (UID: \"cc128726-366c-4926-97f1-bd582dbb2a71\") " Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.149601 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc128726-366c-4926-97f1-bd582dbb2a71-utilities" (OuterVolumeSpecName: "utilities") pod "cc128726-366c-4926-97f1-bd582dbb2a71" (UID: "cc128726-366c-4926-97f1-bd582dbb2a71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.249057 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7nhv\" (UniqueName: \"kubernetes.io/projected/cc128726-366c-4926-97f1-bd582dbb2a71-kube-api-access-b7nhv\") pod \"cc128726-366c-4926-97f1-bd582dbb2a71\" (UID: \"cc128726-366c-4926-97f1-bd582dbb2a71\") " Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.249110 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc128726-366c-4926-97f1-bd582dbb2a71-catalog-content\") pod \"cc128726-366c-4926-97f1-bd582dbb2a71\" (UID: \"cc128726-366c-4926-97f1-bd582dbb2a71\") " Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.249439 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc128726-366c-4926-97f1-bd582dbb2a71-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.257572 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cc128726-366c-4926-97f1-bd582dbb2a71-kube-api-access-b7nhv" (OuterVolumeSpecName: "kube-api-access-b7nhv") pod "cc128726-366c-4926-97f1-bd582dbb2a71" (UID: "cc128726-366c-4926-97f1-bd582dbb2a71"). InnerVolumeSpecName "kube-api-access-b7nhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.351109 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7nhv\" (UniqueName: \"kubernetes.io/projected/cc128726-366c-4926-97f1-bd582dbb2a71-kube-api-access-b7nhv\") on node \"crc\" DevicePath \"\"" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.393106 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc128726-366c-4926-97f1-bd582dbb2a71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc128726-366c-4926-97f1-bd582dbb2a71" (UID: "cc128726-366c-4926-97f1-bd582dbb2a71"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.452177 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc128726-366c-4926-97f1-bd582dbb2a71-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.623147 4749 generic.go:334] "Generic (PLEG): container finished" podID="cc128726-366c-4926-97f1-bd582dbb2a71" containerID="a955f4c646120fd80b9b6235cd58aed298e07e6eb4cc4a29876d50a411f70acd" exitCode=0 Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.623260 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8bbpn" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.626786 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bbpn" event={"ID":"cc128726-366c-4926-97f1-bd582dbb2a71","Type":"ContainerDied","Data":"a955f4c646120fd80b9b6235cd58aed298e07e6eb4cc4a29876d50a411f70acd"} Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.626825 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bbpn" event={"ID":"cc128726-366c-4926-97f1-bd582dbb2a71","Type":"ContainerDied","Data":"7fbde67eff35b49c22eff570b5171db1e3acf79ac47b448efa83d0651da4173c"} Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.626848 4749 scope.go:117] "RemoveContainer" containerID="a955f4c646120fd80b9b6235cd58aed298e07e6eb4cc4a29876d50a411f70acd" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.662695 4749 scope.go:117] "RemoveContainer" containerID="40e575118c0f8dfe700cff95550c22d199192a5d6f12d1f1a245138b3114046f" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.721154 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8bbpn"] Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.751613 4749 scope.go:117] "RemoveContainer" containerID="abb21df2bb96708cb79790214a0b364c419343cbdc986e6aa0a72b1d26b13aa5" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.826679 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8bbpn"] Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.843615 4749 scope.go:117] "RemoveContainer" containerID="a955f4c646120fd80b9b6235cd58aed298e07e6eb4cc4a29876d50a411f70acd" Mar 10 17:27:55 crc kubenswrapper[4749]: E0310 17:27:55.844311 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a955f4c646120fd80b9b6235cd58aed298e07e6eb4cc4a29876d50a411f70acd\": container with ID starting with a955f4c646120fd80b9b6235cd58aed298e07e6eb4cc4a29876d50a411f70acd not found: ID does not exist" containerID="a955f4c646120fd80b9b6235cd58aed298e07e6eb4cc4a29876d50a411f70acd" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.844353 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a955f4c646120fd80b9b6235cd58aed298e07e6eb4cc4a29876d50a411f70acd"} err="failed to get container status \"a955f4c646120fd80b9b6235cd58aed298e07e6eb4cc4a29876d50a411f70acd\": rpc error: code = NotFound desc = could not find container \"a955f4c646120fd80b9b6235cd58aed298e07e6eb4cc4a29876d50a411f70acd\": container with ID starting with a955f4c646120fd80b9b6235cd58aed298e07e6eb4cc4a29876d50a411f70acd not found: ID does not exist" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.844454 4749 scope.go:117] "RemoveContainer" containerID="40e575118c0f8dfe700cff95550c22d199192a5d6f12d1f1a245138b3114046f" Mar 10 17:27:55 crc kubenswrapper[4749]: E0310 17:27:55.846003 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e575118c0f8dfe700cff95550c22d199192a5d6f12d1f1a245138b3114046f\": container with ID starting with 40e575118c0f8dfe700cff95550c22d199192a5d6f12d1f1a245138b3114046f not found: ID does not exist" containerID="40e575118c0f8dfe700cff95550c22d199192a5d6f12d1f1a245138b3114046f" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.846065 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e575118c0f8dfe700cff95550c22d199192a5d6f12d1f1a245138b3114046f"} err="failed to get container status \"40e575118c0f8dfe700cff95550c22d199192a5d6f12d1f1a245138b3114046f\": rpc error: code = NotFound desc = could not find container \"40e575118c0f8dfe700cff95550c22d199192a5d6f12d1f1a245138b3114046f\": container with ID 
starting with 40e575118c0f8dfe700cff95550c22d199192a5d6f12d1f1a245138b3114046f not found: ID does not exist" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.846105 4749 scope.go:117] "RemoveContainer" containerID="abb21df2bb96708cb79790214a0b364c419343cbdc986e6aa0a72b1d26b13aa5" Mar 10 17:27:55 crc kubenswrapper[4749]: E0310 17:27:55.850580 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb21df2bb96708cb79790214a0b364c419343cbdc986e6aa0a72b1d26b13aa5\": container with ID starting with abb21df2bb96708cb79790214a0b364c419343cbdc986e6aa0a72b1d26b13aa5 not found: ID does not exist" containerID="abb21df2bb96708cb79790214a0b364c419343cbdc986e6aa0a72b1d26b13aa5" Mar 10 17:27:55 crc kubenswrapper[4749]: I0310 17:27:55.850633 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb21df2bb96708cb79790214a0b364c419343cbdc986e6aa0a72b1d26b13aa5"} err="failed to get container status \"abb21df2bb96708cb79790214a0b364c419343cbdc986e6aa0a72b1d26b13aa5\": rpc error: code = NotFound desc = could not find container \"abb21df2bb96708cb79790214a0b364c419343cbdc986e6aa0a72b1d26b13aa5\": container with ID starting with abb21df2bb96708cb79790214a0b364c419343cbdc986e6aa0a72b1d26b13aa5 not found: ID does not exist" Mar 10 17:27:57 crc kubenswrapper[4749]: I0310 17:27:57.623076 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc128726-366c-4926-97f1-bd582dbb2a71" path="/var/lib/kubelet/pods/cc128726-366c-4926-97f1-bd582dbb2a71/volumes" Mar 10 17:28:00 crc kubenswrapper[4749]: I0310 17:28:00.166816 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552728-jw7rn"] Mar 10 17:28:00 crc kubenswrapper[4749]: E0310 17:28:00.168040 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc128726-366c-4926-97f1-bd582dbb2a71" containerName="extract-content" Mar 10 17:28:00 crc 
kubenswrapper[4749]: I0310 17:28:00.168058 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc128726-366c-4926-97f1-bd582dbb2a71" containerName="extract-content" Mar 10 17:28:00 crc kubenswrapper[4749]: E0310 17:28:00.168100 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc128726-366c-4926-97f1-bd582dbb2a71" containerName="registry-server" Mar 10 17:28:00 crc kubenswrapper[4749]: I0310 17:28:00.168108 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc128726-366c-4926-97f1-bd582dbb2a71" containerName="registry-server" Mar 10 17:28:00 crc kubenswrapper[4749]: E0310 17:28:00.168125 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc128726-366c-4926-97f1-bd582dbb2a71" containerName="extract-utilities" Mar 10 17:28:00 crc kubenswrapper[4749]: I0310 17:28:00.168134 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc128726-366c-4926-97f1-bd582dbb2a71" containerName="extract-utilities" Mar 10 17:28:00 crc kubenswrapper[4749]: I0310 17:28:00.168359 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc128726-366c-4926-97f1-bd582dbb2a71" containerName="registry-server" Mar 10 17:28:00 crc kubenswrapper[4749]: I0310 17:28:00.169285 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552728-jw7rn" Mar 10 17:28:00 crc kubenswrapper[4749]: I0310 17:28:00.172483 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:28:00 crc kubenswrapper[4749]: I0310 17:28:00.172866 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:28:00 crc kubenswrapper[4749]: I0310 17:28:00.173333 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:28:00 crc kubenswrapper[4749]: I0310 17:28:00.187691 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552728-jw7rn"] Mar 10 17:28:00 crc kubenswrapper[4749]: I0310 17:28:00.368468 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glttn\" (UniqueName: \"kubernetes.io/projected/05dc34f1-6739-4a1e-9d6d-a90fa6747585-kube-api-access-glttn\") pod \"auto-csr-approver-29552728-jw7rn\" (UID: \"05dc34f1-6739-4a1e-9d6d-a90fa6747585\") " pod="openshift-infra/auto-csr-approver-29552728-jw7rn" Mar 10 17:28:00 crc kubenswrapper[4749]: I0310 17:28:00.470985 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glttn\" (UniqueName: \"kubernetes.io/projected/05dc34f1-6739-4a1e-9d6d-a90fa6747585-kube-api-access-glttn\") pod \"auto-csr-approver-29552728-jw7rn\" (UID: \"05dc34f1-6739-4a1e-9d6d-a90fa6747585\") " pod="openshift-infra/auto-csr-approver-29552728-jw7rn" Mar 10 17:28:00 crc kubenswrapper[4749]: I0310 17:28:00.502623 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glttn\" (UniqueName: \"kubernetes.io/projected/05dc34f1-6739-4a1e-9d6d-a90fa6747585-kube-api-access-glttn\") pod \"auto-csr-approver-29552728-jw7rn\" (UID: \"05dc34f1-6739-4a1e-9d6d-a90fa6747585\") " 
pod="openshift-infra/auto-csr-approver-29552728-jw7rn" Mar 10 17:28:00 crc kubenswrapper[4749]: I0310 17:28:00.503025 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552728-jw7rn" Mar 10 17:28:00 crc kubenswrapper[4749]: I0310 17:28:00.812031 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552728-jw7rn"] Mar 10 17:28:01 crc kubenswrapper[4749]: I0310 17:28:01.693958 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552728-jw7rn" event={"ID":"05dc34f1-6739-4a1e-9d6d-a90fa6747585","Type":"ContainerStarted","Data":"0836141d0c0ca67eb2b920e0f501b158f301edb465c3de5369951b01217bcd83"} Mar 10 17:28:02 crc kubenswrapper[4749]: I0310 17:28:02.704297 4749 generic.go:334] "Generic (PLEG): container finished" podID="05dc34f1-6739-4a1e-9d6d-a90fa6747585" containerID="a2ffe19d33de4cdd098978a477e5d893b5455a3977970e01196a13b58163aa08" exitCode=0 Mar 10 17:28:02 crc kubenswrapper[4749]: I0310 17:28:02.704712 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552728-jw7rn" event={"ID":"05dc34f1-6739-4a1e-9d6d-a90fa6747585","Type":"ContainerDied","Data":"a2ffe19d33de4cdd098978a477e5d893b5455a3977970e01196a13b58163aa08"} Mar 10 17:28:04 crc kubenswrapper[4749]: I0310 17:28:04.006095 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552728-jw7rn" Mar 10 17:28:04 crc kubenswrapper[4749]: I0310 17:28:04.134205 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glttn\" (UniqueName: \"kubernetes.io/projected/05dc34f1-6739-4a1e-9d6d-a90fa6747585-kube-api-access-glttn\") pod \"05dc34f1-6739-4a1e-9d6d-a90fa6747585\" (UID: \"05dc34f1-6739-4a1e-9d6d-a90fa6747585\") " Mar 10 17:28:04 crc kubenswrapper[4749]: I0310 17:28:04.139271 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05dc34f1-6739-4a1e-9d6d-a90fa6747585-kube-api-access-glttn" (OuterVolumeSpecName: "kube-api-access-glttn") pod "05dc34f1-6739-4a1e-9d6d-a90fa6747585" (UID: "05dc34f1-6739-4a1e-9d6d-a90fa6747585"). InnerVolumeSpecName "kube-api-access-glttn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:28:04 crc kubenswrapper[4749]: I0310 17:28:04.236983 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glttn\" (UniqueName: \"kubernetes.io/projected/05dc34f1-6739-4a1e-9d6d-a90fa6747585-kube-api-access-glttn\") on node \"crc\" DevicePath \"\"" Mar 10 17:28:04 crc kubenswrapper[4749]: I0310 17:28:04.731691 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552728-jw7rn" event={"ID":"05dc34f1-6739-4a1e-9d6d-a90fa6747585","Type":"ContainerDied","Data":"0836141d0c0ca67eb2b920e0f501b158f301edb465c3de5369951b01217bcd83"} Mar 10 17:28:04 crc kubenswrapper[4749]: I0310 17:28:04.731762 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0836141d0c0ca67eb2b920e0f501b158f301edb465c3de5369951b01217bcd83" Mar 10 17:28:04 crc kubenswrapper[4749]: I0310 17:28:04.731784 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552728-jw7rn" Mar 10 17:28:05 crc kubenswrapper[4749]: I0310 17:28:05.119173 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552722-qcwsm"] Mar 10 17:28:05 crc kubenswrapper[4749]: I0310 17:28:05.120974 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552722-qcwsm"] Mar 10 17:28:05 crc kubenswrapper[4749]: I0310 17:28:05.621133 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc87ef98-b6a1-48d2-87c8-cb79228e6ecb" path="/var/lib/kubelet/pods/fc87ef98-b6a1-48d2-87c8-cb79228e6ecb/volumes" Mar 10 17:28:28 crc kubenswrapper[4749]: I0310 17:28:28.171034 4749 scope.go:117] "RemoveContainer" containerID="18518adbd38e16abc8900975975a5ba227333c3d4513e034aa328c32fdf5502a" Mar 10 17:28:33 crc kubenswrapper[4749]: I0310 17:28:33.073627 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8fd6-account-create-update-pgkg2"] Mar 10 17:28:33 crc kubenswrapper[4749]: I0310 17:28:33.082234 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-75qt2"] Mar 10 17:28:33 crc kubenswrapper[4749]: I0310 17:28:33.089256 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-75qt2"] Mar 10 17:28:33 crc kubenswrapper[4749]: I0310 17:28:33.098553 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8fd6-account-create-update-pgkg2"] Mar 10 17:28:33 crc kubenswrapper[4749]: I0310 17:28:33.622627 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50ab8a41-5c56-4242-b3e8-939c13843785" path="/var/lib/kubelet/pods/50ab8a41-5c56-4242-b3e8-939c13843785/volumes" Mar 10 17:28:33 crc kubenswrapper[4749]: I0310 17:28:33.623511 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c12163-f467-460b-b482-1af14ef0c774" 
path="/var/lib/kubelet/pods/70c12163-f467-460b-b482-1af14ef0c774/volumes" Mar 10 17:28:40 crc kubenswrapper[4749]: I0310 17:28:40.043777 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-cdn5f"] Mar 10 17:28:40 crc kubenswrapper[4749]: I0310 17:28:40.055319 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-cdn5f"] Mar 10 17:28:41 crc kubenswrapper[4749]: I0310 17:28:41.618857 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8658b95-e5ac-4c1f-a48c-1ec864a9df1d" path="/var/lib/kubelet/pods/f8658b95-e5ac-4c1f-a48c-1ec864a9df1d/volumes" Mar 10 17:28:50 crc kubenswrapper[4749]: I0310 17:28:50.981000 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:28:50 crc kubenswrapper[4749]: I0310 17:28:50.981575 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:28:53 crc kubenswrapper[4749]: I0310 17:28:53.052077 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jfqpd"] Mar 10 17:28:53 crc kubenswrapper[4749]: I0310 17:28:53.063710 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jfqpd"] Mar 10 17:28:53 crc kubenswrapper[4749]: I0310 17:28:53.617291 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="249a5269-fc54-4bee-98a8-de5b4bad612b" path="/var/lib/kubelet/pods/249a5269-fc54-4bee-98a8-de5b4bad612b/volumes" Mar 10 17:29:20 crc kubenswrapper[4749]: 
I0310 17:29:20.980759 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:29:20 crc kubenswrapper[4749]: I0310 17:29:20.981522 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:29:28 crc kubenswrapper[4749]: I0310 17:29:28.272819 4749 scope.go:117] "RemoveContainer" containerID="c850ebc80ad511fe6a829e00eb26120d0b1d70681267915b1f77ce8468fafc5d" Mar 10 17:29:28 crc kubenswrapper[4749]: I0310 17:29:28.316070 4749 scope.go:117] "RemoveContainer" containerID="2def575f5f1e5eb4ed9cba46f7132d042b48cef146c8cb817a22b82039c92e4a" Mar 10 17:29:28 crc kubenswrapper[4749]: I0310 17:29:28.372504 4749 scope.go:117] "RemoveContainer" containerID="763b57c76dbfb7c1c2e1347871a97f851eeaf031773277649124dbc8438ca660" Mar 10 17:29:28 crc kubenswrapper[4749]: I0310 17:29:28.442489 4749 scope.go:117] "RemoveContainer" containerID="80c110d995cfc5810c456e45f1acb114cb472ef51e1e84bdc7a71705eb667ff9" Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.022883 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ttr22"] Mar 10 17:29:31 crc kubenswrapper[4749]: E0310 17:29:31.023768 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dc34f1-6739-4a1e-9d6d-a90fa6747585" containerName="oc" Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.023781 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dc34f1-6739-4a1e-9d6d-a90fa6747585" containerName="oc" Mar 10 17:29:31 crc 
kubenswrapper[4749]: I0310 17:29:31.023955 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="05dc34f1-6739-4a1e-9d6d-a90fa6747585" containerName="oc" Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.025168 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.031255 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ttr22"] Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.104862 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc9f8f4-cabd-471f-894d-a9a3714954f2-utilities\") pod \"redhat-marketplace-ttr22\" (UID: \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\") " pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.105068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7bfm\" (UniqueName: \"kubernetes.io/projected/0dc9f8f4-cabd-471f-894d-a9a3714954f2-kube-api-access-t7bfm\") pod \"redhat-marketplace-ttr22\" (UID: \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\") " pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.105467 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc9f8f4-cabd-471f-894d-a9a3714954f2-catalog-content\") pod \"redhat-marketplace-ttr22\" (UID: \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\") " pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.207246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0dc9f8f4-cabd-471f-894d-a9a3714954f2-catalog-content\") pod \"redhat-marketplace-ttr22\" (UID: \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\") " pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.207403 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc9f8f4-cabd-471f-894d-a9a3714954f2-utilities\") pod \"redhat-marketplace-ttr22\" (UID: \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\") " pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.207479 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7bfm\" (UniqueName: \"kubernetes.io/projected/0dc9f8f4-cabd-471f-894d-a9a3714954f2-kube-api-access-t7bfm\") pod \"redhat-marketplace-ttr22\" (UID: \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\") " pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.207798 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc9f8f4-cabd-471f-894d-a9a3714954f2-catalog-content\") pod \"redhat-marketplace-ttr22\" (UID: \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\") " pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.208099 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc9f8f4-cabd-471f-894d-a9a3714954f2-utilities\") pod \"redhat-marketplace-ttr22\" (UID: \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\") " pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.231503 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7bfm\" (UniqueName: 
\"kubernetes.io/projected/0dc9f8f4-cabd-471f-894d-a9a3714954f2-kube-api-access-t7bfm\") pod \"redhat-marketplace-ttr22\" (UID: \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\") " pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.350478 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:31 crc kubenswrapper[4749]: I0310 17:29:31.776058 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ttr22"] Mar 10 17:29:32 crc kubenswrapper[4749]: I0310 17:29:32.504445 4749 generic.go:334] "Generic (PLEG): container finished" podID="0dc9f8f4-cabd-471f-894d-a9a3714954f2" containerID="90a3e4095236c8204c539346780f111b2d493b14030b9a387f22d9d3f828cab9" exitCode=0 Mar 10 17:29:32 crc kubenswrapper[4749]: I0310 17:29:32.504519 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttr22" event={"ID":"0dc9f8f4-cabd-471f-894d-a9a3714954f2","Type":"ContainerDied","Data":"90a3e4095236c8204c539346780f111b2d493b14030b9a387f22d9d3f828cab9"} Mar 10 17:29:32 crc kubenswrapper[4749]: I0310 17:29:32.504822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttr22" event={"ID":"0dc9f8f4-cabd-471f-894d-a9a3714954f2","Type":"ContainerStarted","Data":"8b259fab494bcdf10dbedec3a15638a3a4bf3375fe445a150612d229e4926c93"} Mar 10 17:29:32 crc kubenswrapper[4749]: I0310 17:29:32.507584 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 17:29:33 crc kubenswrapper[4749]: I0310 17:29:33.519302 4749 generic.go:334] "Generic (PLEG): container finished" podID="0dc9f8f4-cabd-471f-894d-a9a3714954f2" containerID="5011e49e2374bf9f3448208c9303b41ac15909e9ae5afc20967df5576c583ed1" exitCode=0 Mar 10 17:29:33 crc kubenswrapper[4749]: I0310 17:29:33.519411 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttr22" event={"ID":"0dc9f8f4-cabd-471f-894d-a9a3714954f2","Type":"ContainerDied","Data":"5011e49e2374bf9f3448208c9303b41ac15909e9ae5afc20967df5576c583ed1"} Mar 10 17:29:34 crc kubenswrapper[4749]: I0310 17:29:34.532301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttr22" event={"ID":"0dc9f8f4-cabd-471f-894d-a9a3714954f2","Type":"ContainerStarted","Data":"81b13a63c84c87e443dc727ffee5d62d73bb66038ee2bbfeea973c4a06a3ce76"} Mar 10 17:29:34 crc kubenswrapper[4749]: I0310 17:29:34.557590 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ttr22" podStartSLOduration=3.152456495 podStartE2EDuration="4.55756186s" podCreationTimestamp="2026-03-10 17:29:30 +0000 UTC" firstStartedPulling="2026-03-10 17:29:32.507252388 +0000 UTC m=+6069.629118085" lastFinishedPulling="2026-03-10 17:29:33.912357763 +0000 UTC m=+6071.034223450" observedRunningTime="2026-03-10 17:29:34.552627936 +0000 UTC m=+6071.674493623" watchObservedRunningTime="2026-03-10 17:29:34.55756186 +0000 UTC m=+6071.679427567" Mar 10 17:29:41 crc kubenswrapper[4749]: I0310 17:29:41.351003 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:41 crc kubenswrapper[4749]: I0310 17:29:41.351553 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:41 crc kubenswrapper[4749]: I0310 17:29:41.398199 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:41 crc kubenswrapper[4749]: I0310 17:29:41.679644 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:41 crc kubenswrapper[4749]: I0310 
17:29:41.734236 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ttr22"] Mar 10 17:29:43 crc kubenswrapper[4749]: I0310 17:29:43.617614 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ttr22" podUID="0dc9f8f4-cabd-471f-894d-a9a3714954f2" containerName="registry-server" containerID="cri-o://81b13a63c84c87e443dc727ffee5d62d73bb66038ee2bbfeea973c4a06a3ce76" gracePeriod=2 Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.029886 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.040578 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc9f8f4-cabd-471f-894d-a9a3714954f2-catalog-content\") pod \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\" (UID: \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\") " Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.040677 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc9f8f4-cabd-471f-894d-a9a3714954f2-utilities\") pod \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\" (UID: \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\") " Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.040744 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7bfm\" (UniqueName: \"kubernetes.io/projected/0dc9f8f4-cabd-471f-894d-a9a3714954f2-kube-api-access-t7bfm\") pod \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\" (UID: \"0dc9f8f4-cabd-471f-894d-a9a3714954f2\") " Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.041494 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dc9f8f4-cabd-471f-894d-a9a3714954f2-utilities" (OuterVolumeSpecName: 
"utilities") pod "0dc9f8f4-cabd-471f-894d-a9a3714954f2" (UID: "0dc9f8f4-cabd-471f-894d-a9a3714954f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.047157 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dc9f8f4-cabd-471f-894d-a9a3714954f2-kube-api-access-t7bfm" (OuterVolumeSpecName: "kube-api-access-t7bfm") pod "0dc9f8f4-cabd-471f-894d-a9a3714954f2" (UID: "0dc9f8f4-cabd-471f-894d-a9a3714954f2"). InnerVolumeSpecName "kube-api-access-t7bfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.142135 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dc9f8f4-cabd-471f-894d-a9a3714954f2-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.142171 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7bfm\" (UniqueName: \"kubernetes.io/projected/0dc9f8f4-cabd-471f-894d-a9a3714954f2-kube-api-access-t7bfm\") on node \"crc\" DevicePath \"\"" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.157198 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dc9f8f4-cabd-471f-894d-a9a3714954f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0dc9f8f4-cabd-471f-894d-a9a3714954f2" (UID: "0dc9f8f4-cabd-471f-894d-a9a3714954f2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.244075 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dc9f8f4-cabd-471f-894d-a9a3714954f2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.627863 4749 generic.go:334] "Generic (PLEG): container finished" podID="0dc9f8f4-cabd-471f-894d-a9a3714954f2" containerID="81b13a63c84c87e443dc727ffee5d62d73bb66038ee2bbfeea973c4a06a3ce76" exitCode=0 Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.628487 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttr22" event={"ID":"0dc9f8f4-cabd-471f-894d-a9a3714954f2","Type":"ContainerDied","Data":"81b13a63c84c87e443dc727ffee5d62d73bb66038ee2bbfeea973c4a06a3ce76"} Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.628571 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ttr22" event={"ID":"0dc9f8f4-cabd-471f-894d-a9a3714954f2","Type":"ContainerDied","Data":"8b259fab494bcdf10dbedec3a15638a3a4bf3375fe445a150612d229e4926c93"} Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.628665 4749 scope.go:117] "RemoveContainer" containerID="81b13a63c84c87e443dc727ffee5d62d73bb66038ee2bbfeea973c4a06a3ce76" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.628849 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ttr22" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.660782 4749 scope.go:117] "RemoveContainer" containerID="5011e49e2374bf9f3448208c9303b41ac15909e9ae5afc20967df5576c583ed1" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.665547 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ttr22"] Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.674313 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ttr22"] Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.684596 4749 scope.go:117] "RemoveContainer" containerID="90a3e4095236c8204c539346780f111b2d493b14030b9a387f22d9d3f828cab9" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.722253 4749 scope.go:117] "RemoveContainer" containerID="81b13a63c84c87e443dc727ffee5d62d73bb66038ee2bbfeea973c4a06a3ce76" Mar 10 17:29:44 crc kubenswrapper[4749]: E0310 17:29:44.722777 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81b13a63c84c87e443dc727ffee5d62d73bb66038ee2bbfeea973c4a06a3ce76\": container with ID starting with 81b13a63c84c87e443dc727ffee5d62d73bb66038ee2bbfeea973c4a06a3ce76 not found: ID does not exist" containerID="81b13a63c84c87e443dc727ffee5d62d73bb66038ee2bbfeea973c4a06a3ce76" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.722806 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81b13a63c84c87e443dc727ffee5d62d73bb66038ee2bbfeea973c4a06a3ce76"} err="failed to get container status \"81b13a63c84c87e443dc727ffee5d62d73bb66038ee2bbfeea973c4a06a3ce76\": rpc error: code = NotFound desc = could not find container \"81b13a63c84c87e443dc727ffee5d62d73bb66038ee2bbfeea973c4a06a3ce76\": container with ID starting with 81b13a63c84c87e443dc727ffee5d62d73bb66038ee2bbfeea973c4a06a3ce76 not found: 
ID does not exist" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.722840 4749 scope.go:117] "RemoveContainer" containerID="5011e49e2374bf9f3448208c9303b41ac15909e9ae5afc20967df5576c583ed1" Mar 10 17:29:44 crc kubenswrapper[4749]: E0310 17:29:44.723072 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5011e49e2374bf9f3448208c9303b41ac15909e9ae5afc20967df5576c583ed1\": container with ID starting with 5011e49e2374bf9f3448208c9303b41ac15909e9ae5afc20967df5576c583ed1 not found: ID does not exist" containerID="5011e49e2374bf9f3448208c9303b41ac15909e9ae5afc20967df5576c583ed1" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.723096 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5011e49e2374bf9f3448208c9303b41ac15909e9ae5afc20967df5576c583ed1"} err="failed to get container status \"5011e49e2374bf9f3448208c9303b41ac15909e9ae5afc20967df5576c583ed1\": rpc error: code = NotFound desc = could not find container \"5011e49e2374bf9f3448208c9303b41ac15909e9ae5afc20967df5576c583ed1\": container with ID starting with 5011e49e2374bf9f3448208c9303b41ac15909e9ae5afc20967df5576c583ed1 not found: ID does not exist" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.723110 4749 scope.go:117] "RemoveContainer" containerID="90a3e4095236c8204c539346780f111b2d493b14030b9a387f22d9d3f828cab9" Mar 10 17:29:44 crc kubenswrapper[4749]: E0310 17:29:44.723340 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90a3e4095236c8204c539346780f111b2d493b14030b9a387f22d9d3f828cab9\": container with ID starting with 90a3e4095236c8204c539346780f111b2d493b14030b9a387f22d9d3f828cab9 not found: ID does not exist" containerID="90a3e4095236c8204c539346780f111b2d493b14030b9a387f22d9d3f828cab9" Mar 10 17:29:44 crc kubenswrapper[4749]: I0310 17:29:44.723360 4749 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a3e4095236c8204c539346780f111b2d493b14030b9a387f22d9d3f828cab9"} err="failed to get container status \"90a3e4095236c8204c539346780f111b2d493b14030b9a387f22d9d3f828cab9\": rpc error: code = NotFound desc = could not find container \"90a3e4095236c8204c539346780f111b2d493b14030b9a387f22d9d3f828cab9\": container with ID starting with 90a3e4095236c8204c539346780f111b2d493b14030b9a387f22d9d3f828cab9 not found: ID does not exist" Mar 10 17:29:45 crc kubenswrapper[4749]: I0310 17:29:45.618433 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dc9f8f4-cabd-471f-894d-a9a3714954f2" path="/var/lib/kubelet/pods/0dc9f8f4-cabd-471f-894d-a9a3714954f2/volumes" Mar 10 17:29:50 crc kubenswrapper[4749]: I0310 17:29:50.980930 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:29:50 crc kubenswrapper[4749]: I0310 17:29:50.981671 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:29:50 crc kubenswrapper[4749]: I0310 17:29:50.981737 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 17:29:50 crc kubenswrapper[4749]: I0310 17:29:50.982615 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"} 
pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 17:29:50 crc kubenswrapper[4749]: I0310 17:29:50.982695 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87" gracePeriod=600
Mar 10 17:29:51 crc kubenswrapper[4749]: E0310 17:29:51.110697 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:29:51 crc kubenswrapper[4749]: I0310 17:29:51.698558 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87" exitCode=0
Mar 10 17:29:51 crc kubenswrapper[4749]: I0310 17:29:51.698702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"}
Mar 10 17:29:51 crc kubenswrapper[4749]: I0310 17:29:51.699056 4749 scope.go:117] "RemoveContainer" containerID="2d05271c71f4359f91a631f79b2084ee3a82bd748331985a2adec121e2772060"
Mar 10 17:29:51 crc kubenswrapper[4749]: I0310 17:29:51.699583 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:29:51 crc kubenswrapper[4749]: E0310 17:29:51.699835 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.165438 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552730-rj2sv"]
Mar 10 17:30:00 crc kubenswrapper[4749]: E0310 17:30:00.166345 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc9f8f4-cabd-471f-894d-a9a3714954f2" containerName="registry-server"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.166362 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc9f8f4-cabd-471f-894d-a9a3714954f2" containerName="registry-server"
Mar 10 17:30:00 crc kubenswrapper[4749]: E0310 17:30:00.166418 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc9f8f4-cabd-471f-894d-a9a3714954f2" containerName="extract-content"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.166427 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc9f8f4-cabd-471f-894d-a9a3714954f2" containerName="extract-content"
Mar 10 17:30:00 crc kubenswrapper[4749]: E0310 17:30:00.166446 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc9f8f4-cabd-471f-894d-a9a3714954f2" containerName="extract-utilities"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.166455 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc9f8f4-cabd-471f-894d-a9a3714954f2" containerName="extract-utilities"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.166657 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dc9f8f4-cabd-471f-894d-a9a3714954f2" containerName="registry-server"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.167360 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552730-rj2sv"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.174685 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.174733 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.175015 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.183529 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"]
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.185892 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.191422 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.191697 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.214737 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552730-rj2sv"]
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.222553 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"]
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.240276 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-config-volume\") pod \"collect-profiles-29552730-6jb2d\" (UID: \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.240356 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbqnq\" (UniqueName: \"kubernetes.io/projected/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-kube-api-access-fbqnq\") pod \"collect-profiles-29552730-6jb2d\" (UID: \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.240429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-secret-volume\") pod \"collect-profiles-29552730-6jb2d\" (UID: \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.240469 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8sbj\" (UniqueName: \"kubernetes.io/projected/14988633-5222-4217-9bc5-21260023616c-kube-api-access-t8sbj\") pod \"auto-csr-approver-29552730-rj2sv\" (UID: \"14988633-5222-4217-9bc5-21260023616c\") " pod="openshift-infra/auto-csr-approver-29552730-rj2sv"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.342585 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-config-volume\") pod \"collect-profiles-29552730-6jb2d\" (UID: \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.343532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-config-volume\") pod \"collect-profiles-29552730-6jb2d\" (UID: \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.343669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbqnq\" (UniqueName: \"kubernetes.io/projected/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-kube-api-access-fbqnq\") pod \"collect-profiles-29552730-6jb2d\" (UID: \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.343701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-secret-volume\") pod \"collect-profiles-29552730-6jb2d\" (UID: \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.344042 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8sbj\" (UniqueName: \"kubernetes.io/projected/14988633-5222-4217-9bc5-21260023616c-kube-api-access-t8sbj\") pod \"auto-csr-approver-29552730-rj2sv\" (UID: \"14988633-5222-4217-9bc5-21260023616c\") " pod="openshift-infra/auto-csr-approver-29552730-rj2sv"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.350002 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-secret-volume\") pod \"collect-profiles-29552730-6jb2d\" (UID: \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.363576 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbqnq\" (UniqueName: \"kubernetes.io/projected/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-kube-api-access-fbqnq\") pod \"collect-profiles-29552730-6jb2d\" (UID: \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.363783 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8sbj\" (UniqueName: \"kubernetes.io/projected/14988633-5222-4217-9bc5-21260023616c-kube-api-access-t8sbj\") pod \"auto-csr-approver-29552730-rj2sv\" (UID: \"14988633-5222-4217-9bc5-21260023616c\") " pod="openshift-infra/auto-csr-approver-29552730-rj2sv"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.490216 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552730-rj2sv"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.510703 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"
Mar 10 17:30:00 crc kubenswrapper[4749]: I0310 17:30:00.954120 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552730-rj2sv"]
Mar 10 17:30:00 crc kubenswrapper[4749]: W0310 17:30:00.956593 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14988633_5222_4217_9bc5_21260023616c.slice/crio-eb7ad0885f923bea720ab03b9a50be5d173f7e68ab6dc50d22dfdf5d3b59510c WatchSource:0}: Error finding container eb7ad0885f923bea720ab03b9a50be5d173f7e68ab6dc50d22dfdf5d3b59510c: Status 404 returned error can't find the container with id eb7ad0885f923bea720ab03b9a50be5d173f7e68ab6dc50d22dfdf5d3b59510c
Mar 10 17:30:01 crc kubenswrapper[4749]: I0310 17:30:01.016407 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"]
Mar 10 17:30:01 crc kubenswrapper[4749]: W0310 17:30:01.023276 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c4d1215_3ba5_4a81_8942_ea6e07eaee45.slice/crio-c574d00178378fae62792d82604bcbe3b2ddb3a7b57024e82c3392042ae1caed WatchSource:0}: Error finding container c574d00178378fae62792d82604bcbe3b2ddb3a7b57024e82c3392042ae1caed: Status 404 returned error can't find the container with id c574d00178378fae62792d82604bcbe3b2ddb3a7b57024e82c3392042ae1caed
Mar 10 17:30:01 crc kubenswrapper[4749]: I0310 17:30:01.783287 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552730-rj2sv" event={"ID":"14988633-5222-4217-9bc5-21260023616c","Type":"ContainerStarted","Data":"eb7ad0885f923bea720ab03b9a50be5d173f7e68ab6dc50d22dfdf5d3b59510c"}
Mar 10 17:30:01 crc kubenswrapper[4749]: I0310 17:30:01.785322 4749 generic.go:334] "Generic (PLEG): container finished" podID="7c4d1215-3ba5-4a81-8942-ea6e07eaee45" containerID="e11cb9cb9a26e96e10b78559bb9145441f585a6260477c2ed53bac42040ecb68" exitCode=0
Mar 10 17:30:01 crc kubenswrapper[4749]: I0310 17:30:01.785429 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d" event={"ID":"7c4d1215-3ba5-4a81-8942-ea6e07eaee45","Type":"ContainerDied","Data":"e11cb9cb9a26e96e10b78559bb9145441f585a6260477c2ed53bac42040ecb68"}
Mar 10 17:30:01 crc kubenswrapper[4749]: I0310 17:30:01.785461 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d" event={"ID":"7c4d1215-3ba5-4a81-8942-ea6e07eaee45","Type":"ContainerStarted","Data":"c574d00178378fae62792d82604bcbe3b2ddb3a7b57024e82c3392042ae1caed"}
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.092658 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.187087 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbqnq\" (UniqueName: \"kubernetes.io/projected/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-kube-api-access-fbqnq\") pod \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\" (UID: \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\") "
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.187128 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-secret-volume\") pod \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\" (UID: \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\") "
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.187208 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-config-volume\") pod \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\" (UID: \"7c4d1215-3ba5-4a81-8942-ea6e07eaee45\") "
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.188354 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-config-volume" (OuterVolumeSpecName: "config-volume") pod "7c4d1215-3ba5-4a81-8942-ea6e07eaee45" (UID: "7c4d1215-3ba5-4a81-8942-ea6e07eaee45"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.193164 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-kube-api-access-fbqnq" (OuterVolumeSpecName: "kube-api-access-fbqnq") pod "7c4d1215-3ba5-4a81-8942-ea6e07eaee45" (UID: "7c4d1215-3ba5-4a81-8942-ea6e07eaee45"). InnerVolumeSpecName "kube-api-access-fbqnq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.193426 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7c4d1215-3ba5-4a81-8942-ea6e07eaee45" (UID: "7c4d1215-3ba5-4a81-8942-ea6e07eaee45"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.288820 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbqnq\" (UniqueName: \"kubernetes.io/projected/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-kube-api-access-fbqnq\") on node \"crc\" DevicePath \"\""
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.288897 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.288918 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c4d1215-3ba5-4a81-8942-ea6e07eaee45-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.802789 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d" event={"ID":"7c4d1215-3ba5-4a81-8942-ea6e07eaee45","Type":"ContainerDied","Data":"c574d00178378fae62792d82604bcbe3b2ddb3a7b57024e82c3392042ae1caed"}
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.803568 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c574d00178378fae62792d82604bcbe3b2ddb3a7b57024e82c3392042ae1caed"
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.802838 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552730-6jb2d"
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.806912 4749 generic.go:334] "Generic (PLEG): container finished" podID="14988633-5222-4217-9bc5-21260023616c" containerID="c123746721d70ea9b28a0786d86bf7608ed262a00d37e1ca22736ea7f76c3a6f" exitCode=0
Mar 10 17:30:03 crc kubenswrapper[4749]: I0310 17:30:03.806966 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552730-rj2sv" event={"ID":"14988633-5222-4217-9bc5-21260023616c","Type":"ContainerDied","Data":"c123746721d70ea9b28a0786d86bf7608ed262a00d37e1ca22736ea7f76c3a6f"}
Mar 10 17:30:04 crc kubenswrapper[4749]: I0310 17:30:04.171692 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"]
Mar 10 17:30:04 crc kubenswrapper[4749]: I0310 17:30:04.180151 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552685-kgjgx"]
Mar 10 17:30:05 crc kubenswrapper[4749]: I0310 17:30:05.181757 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552730-rj2sv"
Mar 10 17:30:05 crc kubenswrapper[4749]: I0310 17:30:05.325938 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8sbj\" (UniqueName: \"kubernetes.io/projected/14988633-5222-4217-9bc5-21260023616c-kube-api-access-t8sbj\") pod \"14988633-5222-4217-9bc5-21260023616c\" (UID: \"14988633-5222-4217-9bc5-21260023616c\") "
Mar 10 17:30:05 crc kubenswrapper[4749]: I0310 17:30:05.341647 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14988633-5222-4217-9bc5-21260023616c-kube-api-access-t8sbj" (OuterVolumeSpecName: "kube-api-access-t8sbj") pod "14988633-5222-4217-9bc5-21260023616c" (UID: "14988633-5222-4217-9bc5-21260023616c"). InnerVolumeSpecName "kube-api-access-t8sbj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 17:30:05 crc kubenswrapper[4749]: I0310 17:30:05.430215 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8sbj\" (UniqueName: \"kubernetes.io/projected/14988633-5222-4217-9bc5-21260023616c-kube-api-access-t8sbj\") on node \"crc\" DevicePath \"\""
Mar 10 17:30:05 crc kubenswrapper[4749]: I0310 17:30:05.624958 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3745e088-a556-46c2-90cf-468698371ccf" path="/var/lib/kubelet/pods/3745e088-a556-46c2-90cf-468698371ccf/volumes"
Mar 10 17:30:05 crc kubenswrapper[4749]: I0310 17:30:05.827624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552730-rj2sv" event={"ID":"14988633-5222-4217-9bc5-21260023616c","Type":"ContainerDied","Data":"eb7ad0885f923bea720ab03b9a50be5d173f7e68ab6dc50d22dfdf5d3b59510c"}
Mar 10 17:30:05 crc kubenswrapper[4749]: I0310 17:30:05.827667 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb7ad0885f923bea720ab03b9a50be5d173f7e68ab6dc50d22dfdf5d3b59510c"
Mar 10 17:30:05 crc kubenswrapper[4749]: I0310 17:30:05.828093 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552730-rj2sv"
Mar 10 17:30:06 crc kubenswrapper[4749]: I0310 17:30:06.242004 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552724-8wvpn"]
Mar 10 17:30:06 crc kubenswrapper[4749]: I0310 17:30:06.248252 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552724-8wvpn"]
Mar 10 17:30:06 crc kubenswrapper[4749]: I0310 17:30:06.606338 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:30:06 crc kubenswrapper[4749]: E0310 17:30:06.606916 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:30:07 crc kubenswrapper[4749]: I0310 17:30:07.627844 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="989a1514-fe33-4892-9ad4-d9d471fe007c" path="/var/lib/kubelet/pods/989a1514-fe33-4892-9ad4-d9d471fe007c/volumes"
Mar 10 17:30:20 crc kubenswrapper[4749]: I0310 17:30:20.606468 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:30:20 crc kubenswrapper[4749]: E0310 17:30:20.607260 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:30:28 crc kubenswrapper[4749]: I0310 17:30:28.541550 4749 scope.go:117] "RemoveContainer" containerID="1cc09adc0ba0684fe95206d91b748578f4bc68cc4da4515669c1d91569420ba6"
Mar 10 17:30:28 crc kubenswrapper[4749]: I0310 17:30:28.568450 4749 scope.go:117] "RemoveContainer" containerID="72c26045a87be0ede557dd4d3f87f986512799cc8c615cc6799f1a6fc6fd985e"
Mar 10 17:30:35 crc kubenswrapper[4749]: I0310 17:30:35.607320 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:30:35 crc kubenswrapper[4749]: E0310 17:30:35.608594 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:30:46 crc kubenswrapper[4749]: I0310 17:30:46.607082 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:30:46 crc kubenswrapper[4749]: E0310 17:30:46.607623 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:31:00 crc kubenswrapper[4749]: I0310 17:31:00.606594 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:31:00 crc kubenswrapper[4749]: E0310 17:31:00.607427 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:31:12 crc kubenswrapper[4749]: I0310 17:31:12.606625 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:31:12 crc kubenswrapper[4749]: E0310 17:31:12.607873 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:31:24 crc kubenswrapper[4749]: I0310 17:31:24.607015 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:31:24 crc kubenswrapper[4749]: E0310 17:31:24.607849 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:31:36 crc kubenswrapper[4749]: I0310 17:31:36.606789 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:31:36 crc kubenswrapper[4749]: E0310 17:31:36.607348 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:31:50 crc kubenswrapper[4749]: I0310 17:31:50.607891 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:31:50 crc kubenswrapper[4749]: E0310 17:31:50.609221 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.188469 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552732-2mt6x"]
Mar 10 17:32:00 crc kubenswrapper[4749]: E0310 17:32:00.189406 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14988633-5222-4217-9bc5-21260023616c" containerName="oc"
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.189423 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="14988633-5222-4217-9bc5-21260023616c" containerName="oc"
Mar 10 17:32:00 crc kubenswrapper[4749]: E0310 17:32:00.189444 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c4d1215-3ba5-4a81-8942-ea6e07eaee45" containerName="collect-profiles"
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.189453 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c4d1215-3ba5-4a81-8942-ea6e07eaee45" containerName="collect-profiles"
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.189636 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="14988633-5222-4217-9bc5-21260023616c" containerName="oc"
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.189664 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c4d1215-3ba5-4a81-8942-ea6e07eaee45" containerName="collect-profiles"
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.190277 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552732-2mt6x"
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.193352 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7"
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.193557 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.195945 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.214112 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552732-2mt6x"]
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.364773 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qn7k\" (UniqueName: \"kubernetes.io/projected/41a3a4c1-f676-48c1-8613-6ea3a6b8be8d-kube-api-access-2qn7k\") pod \"auto-csr-approver-29552732-2mt6x\" (UID: \"41a3a4c1-f676-48c1-8613-6ea3a6b8be8d\") " pod="openshift-infra/auto-csr-approver-29552732-2mt6x"
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.466649 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qn7k\" (UniqueName: \"kubernetes.io/projected/41a3a4c1-f676-48c1-8613-6ea3a6b8be8d-kube-api-access-2qn7k\") pod \"auto-csr-approver-29552732-2mt6x\" (UID: \"41a3a4c1-f676-48c1-8613-6ea3a6b8be8d\") " pod="openshift-infra/auto-csr-approver-29552732-2mt6x"
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.495010 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qn7k\" (UniqueName: \"kubernetes.io/projected/41a3a4c1-f676-48c1-8613-6ea3a6b8be8d-kube-api-access-2qn7k\") pod \"auto-csr-approver-29552732-2mt6x\" (UID: \"41a3a4c1-f676-48c1-8613-6ea3a6b8be8d\") " pod="openshift-infra/auto-csr-approver-29552732-2mt6x"
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.510354 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552732-2mt6x"
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.921107 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552732-2mt6x"]
Mar 10 17:32:00 crc kubenswrapper[4749]: I0310 17:32:00.998094 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552732-2mt6x" event={"ID":"41a3a4c1-f676-48c1-8613-6ea3a6b8be8d","Type":"ContainerStarted","Data":"009faa80d0050f6c25f8ab568b49369d04e5c7ebff94801b18714882a11dcf87"}
Mar 10 17:32:01 crc kubenswrapper[4749]: I0310 17:32:01.607451 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:32:01 crc kubenswrapper[4749]: E0310 17:32:01.607690 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:32:03 crc kubenswrapper[4749]: I0310 17:32:03.014249 4749 generic.go:334] "Generic (PLEG): container finished" podID="41a3a4c1-f676-48c1-8613-6ea3a6b8be8d" containerID="79c8c4b936467926790020d942e1fd1a67001a256d9b78c5fd7a4762b01ff7de" exitCode=0
Mar 10 17:32:03 crc kubenswrapper[4749]: I0310 17:32:03.014302 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552732-2mt6x" event={"ID":"41a3a4c1-f676-48c1-8613-6ea3a6b8be8d","Type":"ContainerDied","Data":"79c8c4b936467926790020d942e1fd1a67001a256d9b78c5fd7a4762b01ff7de"}
Mar 10 17:32:04 crc kubenswrapper[4749]: I0310 17:32:04.333706 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552732-2mt6x"
Mar 10 17:32:04 crc kubenswrapper[4749]: I0310 17:32:04.432393 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qn7k\" (UniqueName: \"kubernetes.io/projected/41a3a4c1-f676-48c1-8613-6ea3a6b8be8d-kube-api-access-2qn7k\") pod \"41a3a4c1-f676-48c1-8613-6ea3a6b8be8d\" (UID: \"41a3a4c1-f676-48c1-8613-6ea3a6b8be8d\") "
Mar 10 17:32:04 crc kubenswrapper[4749]: I0310 17:32:04.439638 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a3a4c1-f676-48c1-8613-6ea3a6b8be8d-kube-api-access-2qn7k" (OuterVolumeSpecName: "kube-api-access-2qn7k") pod "41a3a4c1-f676-48c1-8613-6ea3a6b8be8d" (UID: "41a3a4c1-f676-48c1-8613-6ea3a6b8be8d"). InnerVolumeSpecName "kube-api-access-2qn7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 17:32:04 crc kubenswrapper[4749]: I0310 17:32:04.534736 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qn7k\" (UniqueName: \"kubernetes.io/projected/41a3a4c1-f676-48c1-8613-6ea3a6b8be8d-kube-api-access-2qn7k\") on node \"crc\" DevicePath \"\""
Mar 10 17:32:05 crc kubenswrapper[4749]: I0310 17:32:05.032789 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552732-2mt6x" event={"ID":"41a3a4c1-f676-48c1-8613-6ea3a6b8be8d","Type":"ContainerDied","Data":"009faa80d0050f6c25f8ab568b49369d04e5c7ebff94801b18714882a11dcf87"}
Mar 10 17:32:05 crc kubenswrapper[4749]: I0310 17:32:05.032848 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="009faa80d0050f6c25f8ab568b49369d04e5c7ebff94801b18714882a11dcf87"
Mar 10 17:32:05 crc kubenswrapper[4749]: I0310 17:32:05.032855 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552732-2mt6x"
Mar 10 17:32:05 crc kubenswrapper[4749]: I0310 17:32:05.399548 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552726-fgcrs"]
Mar 10 17:32:05 crc kubenswrapper[4749]: I0310 17:32:05.410777 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552726-fgcrs"]
Mar 10 17:32:05 crc kubenswrapper[4749]: I0310 17:32:05.621112 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7c63e28-dfb1-47d3-9d15-9d7cda0161c8" path="/var/lib/kubelet/pods/a7c63e28-dfb1-47d3-9d15-9d7cda0161c8/volumes"
Mar 10 17:32:15 crc kubenswrapper[4749]: I0310 17:32:15.606621 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:32:15 crc kubenswrapper[4749]: E0310 17:32:15.607429 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:32:27 crc kubenswrapper[4749]: I0310 17:32:27.606718 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:32:27 crc kubenswrapper[4749]: E0310 17:32:27.607677 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:32:28 crc kubenswrapper[4749]: I0310 17:32:28.692722 4749 scope.go:117] "RemoveContainer" containerID="21542492f0bcb0f1a60cedb66c2497821a0b2d45ca1af38ad2320a8f28ff908e"
Mar 10 17:32:42 crc kubenswrapper[4749]: I0310 17:32:42.606627 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:32:42 crc kubenswrapper[4749]: E0310 17:32:42.607170 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:32:53 crc kubenswrapper[4749]: I0310 17:32:53.613366 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:32:53 crc kubenswrapper[4749]: E0310 17:32:53.614280 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:33:04 crc kubenswrapper[4749]: I0310 17:33:04.607530 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:33:04 crc kubenswrapper[4749]: E0310 17:33:04.608469 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:33:15 crc kubenswrapper[4749]: I0310 17:33:15.606740 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87"
Mar 10 17:33:15 crc kubenswrapper[4749]: E0310 17:33:15.607496 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50"
Mar 10 17:33:29 crc kubenswrapper[4749]: I0310 17:33:29.607114 4749 scope.go:117]
"RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87" Mar 10 17:33:29 crc kubenswrapper[4749]: E0310 17:33:29.607814 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:33:40 crc kubenswrapper[4749]: I0310 17:33:40.607539 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87" Mar 10 17:33:40 crc kubenswrapper[4749]: E0310 17:33:40.608264 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:33:55 crc kubenswrapper[4749]: I0310 17:33:55.606756 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87" Mar 10 17:33:55 crc kubenswrapper[4749]: E0310 17:33:55.607682 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:34:00 crc kubenswrapper[4749]: I0310 17:34:00.169516 
4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552734-bkqrn"] Mar 10 17:34:00 crc kubenswrapper[4749]: E0310 17:34:00.171238 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a3a4c1-f676-48c1-8613-6ea3a6b8be8d" containerName="oc" Mar 10 17:34:00 crc kubenswrapper[4749]: I0310 17:34:00.171279 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a3a4c1-f676-48c1-8613-6ea3a6b8be8d" containerName="oc" Mar 10 17:34:00 crc kubenswrapper[4749]: I0310 17:34:00.171574 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a3a4c1-f676-48c1-8613-6ea3a6b8be8d" containerName="oc" Mar 10 17:34:00 crc kubenswrapper[4749]: I0310 17:34:00.172425 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552734-bkqrn" Mar 10 17:34:00 crc kubenswrapper[4749]: I0310 17:34:00.175617 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:34:00 crc kubenswrapper[4749]: I0310 17:34:00.175829 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:34:00 crc kubenswrapper[4749]: I0310 17:34:00.176474 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:34:00 crc kubenswrapper[4749]: I0310 17:34:00.177063 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552734-bkqrn"] Mar 10 17:34:00 crc kubenswrapper[4749]: I0310 17:34:00.206848 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bddl2\" (UniqueName: \"kubernetes.io/projected/5d38ec74-2164-42f9-bb6b-046acc6dec25-kube-api-access-bddl2\") pod \"auto-csr-approver-29552734-bkqrn\" (UID: \"5d38ec74-2164-42f9-bb6b-046acc6dec25\") " 
pod="openshift-infra/auto-csr-approver-29552734-bkqrn" Mar 10 17:34:00 crc kubenswrapper[4749]: I0310 17:34:00.308522 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bddl2\" (UniqueName: \"kubernetes.io/projected/5d38ec74-2164-42f9-bb6b-046acc6dec25-kube-api-access-bddl2\") pod \"auto-csr-approver-29552734-bkqrn\" (UID: \"5d38ec74-2164-42f9-bb6b-046acc6dec25\") " pod="openshift-infra/auto-csr-approver-29552734-bkqrn" Mar 10 17:34:00 crc kubenswrapper[4749]: I0310 17:34:00.335269 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bddl2\" (UniqueName: \"kubernetes.io/projected/5d38ec74-2164-42f9-bb6b-046acc6dec25-kube-api-access-bddl2\") pod \"auto-csr-approver-29552734-bkqrn\" (UID: \"5d38ec74-2164-42f9-bb6b-046acc6dec25\") " pod="openshift-infra/auto-csr-approver-29552734-bkqrn" Mar 10 17:34:00 crc kubenswrapper[4749]: I0310 17:34:00.504234 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552734-bkqrn" Mar 10 17:34:00 crc kubenswrapper[4749]: I0310 17:34:00.774174 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552734-bkqrn"] Mar 10 17:34:01 crc kubenswrapper[4749]: I0310 17:34:01.026846 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552734-bkqrn" event={"ID":"5d38ec74-2164-42f9-bb6b-046acc6dec25","Type":"ContainerStarted","Data":"2cca2fee49a06c108c6927e23c48b2ba74fbe5b847b98faae5f8376d531c8717"} Mar 10 17:34:03 crc kubenswrapper[4749]: I0310 17:34:03.047593 4749 generic.go:334] "Generic (PLEG): container finished" podID="5d38ec74-2164-42f9-bb6b-046acc6dec25" containerID="98cab473538e43d5a81230e4b4ad80cdf3e428e8bc64853b68e5411981da21e3" exitCode=0 Mar 10 17:34:03 crc kubenswrapper[4749]: I0310 17:34:03.047652 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29552734-bkqrn" event={"ID":"5d38ec74-2164-42f9-bb6b-046acc6dec25","Type":"ContainerDied","Data":"98cab473538e43d5a81230e4b4ad80cdf3e428e8bc64853b68e5411981da21e3"} Mar 10 17:34:04 crc kubenswrapper[4749]: I0310 17:34:04.385386 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552734-bkqrn" Mar 10 17:34:04 crc kubenswrapper[4749]: I0310 17:34:04.484564 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bddl2\" (UniqueName: \"kubernetes.io/projected/5d38ec74-2164-42f9-bb6b-046acc6dec25-kube-api-access-bddl2\") pod \"5d38ec74-2164-42f9-bb6b-046acc6dec25\" (UID: \"5d38ec74-2164-42f9-bb6b-046acc6dec25\") " Mar 10 17:34:04 crc kubenswrapper[4749]: I0310 17:34:04.490494 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d38ec74-2164-42f9-bb6b-046acc6dec25-kube-api-access-bddl2" (OuterVolumeSpecName: "kube-api-access-bddl2") pod "5d38ec74-2164-42f9-bb6b-046acc6dec25" (UID: "5d38ec74-2164-42f9-bb6b-046acc6dec25"). InnerVolumeSpecName "kube-api-access-bddl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:34:04 crc kubenswrapper[4749]: I0310 17:34:04.586242 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bddl2\" (UniqueName: \"kubernetes.io/projected/5d38ec74-2164-42f9-bb6b-046acc6dec25-kube-api-access-bddl2\") on node \"crc\" DevicePath \"\"" Mar 10 17:34:05 crc kubenswrapper[4749]: I0310 17:34:05.069114 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552734-bkqrn" event={"ID":"5d38ec74-2164-42f9-bb6b-046acc6dec25","Type":"ContainerDied","Data":"2cca2fee49a06c108c6927e23c48b2ba74fbe5b847b98faae5f8376d531c8717"} Mar 10 17:34:05 crc kubenswrapper[4749]: I0310 17:34:05.069320 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cca2fee49a06c108c6927e23c48b2ba74fbe5b847b98faae5f8376d531c8717" Mar 10 17:34:05 crc kubenswrapper[4749]: I0310 17:34:05.069135 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552734-bkqrn" Mar 10 17:34:05 crc kubenswrapper[4749]: I0310 17:34:05.448582 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552728-jw7rn"] Mar 10 17:34:05 crc kubenswrapper[4749]: I0310 17:34:05.454315 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552728-jw7rn"] Mar 10 17:34:05 crc kubenswrapper[4749]: I0310 17:34:05.617518 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05dc34f1-6739-4a1e-9d6d-a90fa6747585" path="/var/lib/kubelet/pods/05dc34f1-6739-4a1e-9d6d-a90fa6747585/volumes" Mar 10 17:34:07 crc kubenswrapper[4749]: I0310 17:34:07.606777 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87" Mar 10 17:34:07 crc kubenswrapper[4749]: E0310 17:34:07.607145 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:34:21 crc kubenswrapper[4749]: I0310 17:34:21.607172 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87" Mar 10 17:34:21 crc kubenswrapper[4749]: E0310 17:34:21.608027 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:34:28 crc kubenswrapper[4749]: I0310 17:34:28.805450 4749 scope.go:117] "RemoveContainer" containerID="a2ffe19d33de4cdd098978a477e5d893b5455a3977970e01196a13b58163aa08" Mar 10 17:34:34 crc kubenswrapper[4749]: I0310 17:34:34.607667 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87" Mar 10 17:34:34 crc kubenswrapper[4749]: E0310 17:34:34.608499 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.494415 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-7zs6p"] Mar 10 17:34:39 crc kubenswrapper[4749]: E0310 17:34:39.495454 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d38ec74-2164-42f9-bb6b-046acc6dec25" containerName="oc" Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.495478 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d38ec74-2164-42f9-bb6b-046acc6dec25" containerName="oc" Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.495734 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d38ec74-2164-42f9-bb6b-046acc6dec25" containerName="oc" Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.497222 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.513650 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7zs6p"] Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.593495 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-utilities\") pod \"community-operators-7zs6p\" (UID: \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\") " pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.593541 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fdbt\" (UniqueName: \"kubernetes.io/projected/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-kube-api-access-8fdbt\") pod \"community-operators-7zs6p\" (UID: \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\") " pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.593690 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-catalog-content\") pod \"community-operators-7zs6p\" (UID: \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\") " pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.695416 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-catalog-content\") pod \"community-operators-7zs6p\" (UID: \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\") " pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.695910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-utilities\") pod \"community-operators-7zs6p\" (UID: \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\") " pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.695936 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fdbt\" (UniqueName: \"kubernetes.io/projected/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-kube-api-access-8fdbt\") pod \"community-operators-7zs6p\" (UID: \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\") " pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.696220 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-catalog-content\") pod \"community-operators-7zs6p\" (UID: \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\") " pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.696571 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-utilities\") pod \"community-operators-7zs6p\" (UID: \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\") " pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.746095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fdbt\" (UniqueName: \"kubernetes.io/projected/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-kube-api-access-8fdbt\") pod \"community-operators-7zs6p\" (UID: \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\") " pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:39 crc kubenswrapper[4749]: I0310 17:34:39.823180 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:40 crc kubenswrapper[4749]: I0310 17:34:40.363695 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7zs6p"] Mar 10 17:34:41 crc kubenswrapper[4749]: I0310 17:34:41.386472 4749 generic.go:334] "Generic (PLEG): container finished" podID="bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" containerID="257f1baf1f465f9e7a8ce64779ad383ccd31de44a4a8a52245c5a062d77a8f1d" exitCode=0 Mar 10 17:34:41 crc kubenswrapper[4749]: I0310 17:34:41.386822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zs6p" event={"ID":"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5","Type":"ContainerDied","Data":"257f1baf1f465f9e7a8ce64779ad383ccd31de44a4a8a52245c5a062d77a8f1d"} Mar 10 17:34:41 crc kubenswrapper[4749]: I0310 17:34:41.386872 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zs6p" event={"ID":"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5","Type":"ContainerStarted","Data":"1309595c92e64e348274b18bbabd30036c78112756e8b428ba14f6cb30bc5633"} Mar 10 17:34:41 crc kubenswrapper[4749]: I0310 17:34:41.389496 4749 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 10 17:34:42 crc kubenswrapper[4749]: I0310 17:34:42.696762 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wfvxt"] Mar 10 17:34:42 crc kubenswrapper[4749]: I0310 17:34:42.698891 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:42 crc kubenswrapper[4749]: I0310 17:34:42.710513 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wfvxt"] Mar 10 17:34:42 crc kubenswrapper[4749]: I0310 17:34:42.858758 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd11bf6-6fc9-477b-a934-06359e31ca54-utilities\") pod \"certified-operators-wfvxt\" (UID: \"edd11bf6-6fc9-477b-a934-06359e31ca54\") " pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:42 crc kubenswrapper[4749]: I0310 17:34:42.858823 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd11bf6-6fc9-477b-a934-06359e31ca54-catalog-content\") pod \"certified-operators-wfvxt\" (UID: \"edd11bf6-6fc9-477b-a934-06359e31ca54\") " pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:42 crc kubenswrapper[4749]: I0310 17:34:42.858875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpzjm\" (UniqueName: \"kubernetes.io/projected/edd11bf6-6fc9-477b-a934-06359e31ca54-kube-api-access-rpzjm\") pod \"certified-operators-wfvxt\" (UID: \"edd11bf6-6fc9-477b-a934-06359e31ca54\") " pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:42 crc kubenswrapper[4749]: I0310 17:34:42.960055 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rpzjm\" (UniqueName: \"kubernetes.io/projected/edd11bf6-6fc9-477b-a934-06359e31ca54-kube-api-access-rpzjm\") pod \"certified-operators-wfvxt\" (UID: \"edd11bf6-6fc9-477b-a934-06359e31ca54\") " pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:42 crc kubenswrapper[4749]: I0310 17:34:42.960247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd11bf6-6fc9-477b-a934-06359e31ca54-utilities\") pod \"certified-operators-wfvxt\" (UID: \"edd11bf6-6fc9-477b-a934-06359e31ca54\") " pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:42 crc kubenswrapper[4749]: I0310 17:34:42.960290 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd11bf6-6fc9-477b-a934-06359e31ca54-catalog-content\") pod \"certified-operators-wfvxt\" (UID: \"edd11bf6-6fc9-477b-a934-06359e31ca54\") " pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:42 crc kubenswrapper[4749]: I0310 17:34:42.960816 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd11bf6-6fc9-477b-a934-06359e31ca54-utilities\") pod \"certified-operators-wfvxt\" (UID: \"edd11bf6-6fc9-477b-a934-06359e31ca54\") " pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:42 crc kubenswrapper[4749]: I0310 17:34:42.960897 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd11bf6-6fc9-477b-a934-06359e31ca54-catalog-content\") pod \"certified-operators-wfvxt\" (UID: \"edd11bf6-6fc9-477b-a934-06359e31ca54\") " pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:42 crc kubenswrapper[4749]: I0310 17:34:42.989033 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpzjm\" (UniqueName: 
\"kubernetes.io/projected/edd11bf6-6fc9-477b-a934-06359e31ca54-kube-api-access-rpzjm\") pod \"certified-operators-wfvxt\" (UID: \"edd11bf6-6fc9-477b-a934-06359e31ca54\") " pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:43 crc kubenswrapper[4749]: I0310 17:34:43.190492 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:43 crc kubenswrapper[4749]: I0310 17:34:43.410487 4749 generic.go:334] "Generic (PLEG): container finished" podID="bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" containerID="f66626c907bed75473591c4e6e56e70f36062ef7138f333baa86166f5e0eb443" exitCode=0 Mar 10 17:34:43 crc kubenswrapper[4749]: I0310 17:34:43.410810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zs6p" event={"ID":"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5","Type":"ContainerDied","Data":"f66626c907bed75473591c4e6e56e70f36062ef7138f333baa86166f5e0eb443"} Mar 10 17:34:43 crc kubenswrapper[4749]: I0310 17:34:43.550989 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wfvxt"] Mar 10 17:34:43 crc kubenswrapper[4749]: W0310 17:34:43.557065 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedd11bf6_6fc9_477b_a934_06359e31ca54.slice/crio-4d4822672da0795c5ba126e77b2552c619f2ba5f502f89eebe04dd207e802687 WatchSource:0}: Error finding container 4d4822672da0795c5ba126e77b2552c619f2ba5f502f89eebe04dd207e802687: Status 404 returned error can't find the container with id 4d4822672da0795c5ba126e77b2552c619f2ba5f502f89eebe04dd207e802687 Mar 10 17:34:44 crc kubenswrapper[4749]: I0310 17:34:44.422062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zs6p" 
event={"ID":"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5","Type":"ContainerStarted","Data":"ca0bb5fa4cc56a3e838fdbba10d19b7e27450fe15d071072e53608bf0b05aace"} Mar 10 17:34:44 crc kubenswrapper[4749]: I0310 17:34:44.423569 4749 generic.go:334] "Generic (PLEG): container finished" podID="edd11bf6-6fc9-477b-a934-06359e31ca54" containerID="1e8de540184c3d5af0b93af932bd9fe05cedf505f654e10e564710ae9da34281" exitCode=0 Mar 10 17:34:44 crc kubenswrapper[4749]: I0310 17:34:44.423601 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wfvxt" event={"ID":"edd11bf6-6fc9-477b-a934-06359e31ca54","Type":"ContainerDied","Data":"1e8de540184c3d5af0b93af932bd9fe05cedf505f654e10e564710ae9da34281"} Mar 10 17:34:44 crc kubenswrapper[4749]: I0310 17:34:44.423621 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wfvxt" event={"ID":"edd11bf6-6fc9-477b-a934-06359e31ca54","Type":"ContainerStarted","Data":"4d4822672da0795c5ba126e77b2552c619f2ba5f502f89eebe04dd207e802687"} Mar 10 17:34:44 crc kubenswrapper[4749]: I0310 17:34:44.446424 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7zs6p" podStartSLOduration=2.744372537 podStartE2EDuration="5.446405403s" podCreationTimestamp="2026-03-10 17:34:39 +0000 UTC" firstStartedPulling="2026-03-10 17:34:41.388988479 +0000 UTC m=+6378.510854206" lastFinishedPulling="2026-03-10 17:34:44.091021385 +0000 UTC m=+6381.212887072" observedRunningTime="2026-03-10 17:34:44.439711171 +0000 UTC m=+6381.561576868" watchObservedRunningTime="2026-03-10 17:34:44.446405403 +0000 UTC m=+6381.568271100" Mar 10 17:34:45 crc kubenswrapper[4749]: I0310 17:34:45.432863 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wfvxt" 
event={"ID":"edd11bf6-6fc9-477b-a934-06359e31ca54","Type":"ContainerStarted","Data":"dc7c151c1975389b51c6e9ee7ab9089eb774be7579cfa0e25c520696837baac5"} Mar 10 17:34:46 crc kubenswrapper[4749]: I0310 17:34:46.444527 4749 generic.go:334] "Generic (PLEG): container finished" podID="edd11bf6-6fc9-477b-a934-06359e31ca54" containerID="dc7c151c1975389b51c6e9ee7ab9089eb774be7579cfa0e25c520696837baac5" exitCode=0 Mar 10 17:34:46 crc kubenswrapper[4749]: I0310 17:34:46.444592 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wfvxt" event={"ID":"edd11bf6-6fc9-477b-a934-06359e31ca54","Type":"ContainerDied","Data":"dc7c151c1975389b51c6e9ee7ab9089eb774be7579cfa0e25c520696837baac5"} Mar 10 17:34:47 crc kubenswrapper[4749]: I0310 17:34:47.456617 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wfvxt" event={"ID":"edd11bf6-6fc9-477b-a934-06359e31ca54","Type":"ContainerStarted","Data":"1360d10d5c3932e014e6fc3bd9c48cbacc73f9b5b92d882027f77ce82262f8d3"} Mar 10 17:34:47 crc kubenswrapper[4749]: I0310 17:34:47.478550 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wfvxt" podStartSLOduration=3.057098064 podStartE2EDuration="5.47852645s" podCreationTimestamp="2026-03-10 17:34:42 +0000 UTC" firstStartedPulling="2026-03-10 17:34:44.424639103 +0000 UTC m=+6381.546504790" lastFinishedPulling="2026-03-10 17:34:46.846067489 +0000 UTC m=+6383.967933176" observedRunningTime="2026-03-10 17:34:47.473550555 +0000 UTC m=+6384.595416262" watchObservedRunningTime="2026-03-10 17:34:47.47852645 +0000 UTC m=+6384.600392137" Mar 10 17:34:48 crc kubenswrapper[4749]: I0310 17:34:48.607216 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87" Mar 10 17:34:48 crc kubenswrapper[4749]: E0310 17:34:48.607468 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:34:49 crc kubenswrapper[4749]: I0310 17:34:49.823797 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:49 crc kubenswrapper[4749]: I0310 17:34:49.824189 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:49 crc kubenswrapper[4749]: I0310 17:34:49.869845 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:50 crc kubenswrapper[4749]: I0310 17:34:50.536566 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:51 crc kubenswrapper[4749]: I0310 17:34:51.083501 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7zs6p"] Mar 10 17:34:52 crc kubenswrapper[4749]: I0310 17:34:52.498944 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7zs6p" podUID="bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" containerName="registry-server" containerID="cri-o://ca0bb5fa4cc56a3e838fdbba10d19b7e27450fe15d071072e53608bf0b05aace" gracePeriod=2 Mar 10 17:34:52 crc kubenswrapper[4749]: I0310 17:34:52.969466 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.036257 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-catalog-content\") pod \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\" (UID: \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\") " Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.036592 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-utilities\") pod \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\" (UID: \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\") " Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.036662 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fdbt\" (UniqueName: \"kubernetes.io/projected/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-kube-api-access-8fdbt\") pod \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\" (UID: \"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5\") " Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.039134 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-utilities" (OuterVolumeSpecName: "utilities") pod "bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" (UID: "bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.045092 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-kube-api-access-8fdbt" (OuterVolumeSpecName: "kube-api-access-8fdbt") pod "bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" (UID: "bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5"). InnerVolumeSpecName "kube-api-access-8fdbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.116325 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" (UID: "bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.139244 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.139287 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fdbt\" (UniqueName: \"kubernetes.io/projected/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-kube-api-access-8fdbt\") on node \"crc\" DevicePath \"\"" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.139301 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.191705 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.191755 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.250905 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.508269 4749 generic.go:334] "Generic (PLEG): container 
finished" podID="bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" containerID="ca0bb5fa4cc56a3e838fdbba10d19b7e27450fe15d071072e53608bf0b05aace" exitCode=0 Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.509081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zs6p" event={"ID":"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5","Type":"ContainerDied","Data":"ca0bb5fa4cc56a3e838fdbba10d19b7e27450fe15d071072e53608bf0b05aace"} Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.509109 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zs6p" event={"ID":"bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5","Type":"ContainerDied","Data":"1309595c92e64e348274b18bbabd30036c78112756e8b428ba14f6cb30bc5633"} Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.509127 4749 scope.go:117] "RemoveContainer" containerID="ca0bb5fa4cc56a3e838fdbba10d19b7e27450fe15d071072e53608bf0b05aace" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.509125 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7zs6p" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.530783 4749 scope.go:117] "RemoveContainer" containerID="f66626c907bed75473591c4e6e56e70f36062ef7138f333baa86166f5e0eb443" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.558800 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7zs6p"] Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.562100 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7zs6p"] Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.569670 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.580709 4749 scope.go:117] "RemoveContainer" containerID="257f1baf1f465f9e7a8ce64779ad383ccd31de44a4a8a52245c5a062d77a8f1d" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.601833 4749 scope.go:117] "RemoveContainer" containerID="ca0bb5fa4cc56a3e838fdbba10d19b7e27450fe15d071072e53608bf0b05aace" Mar 10 17:34:53 crc kubenswrapper[4749]: E0310 17:34:53.602960 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca0bb5fa4cc56a3e838fdbba10d19b7e27450fe15d071072e53608bf0b05aace\": container with ID starting with ca0bb5fa4cc56a3e838fdbba10d19b7e27450fe15d071072e53608bf0b05aace not found: ID does not exist" containerID="ca0bb5fa4cc56a3e838fdbba10d19b7e27450fe15d071072e53608bf0b05aace" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.603007 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca0bb5fa4cc56a3e838fdbba10d19b7e27450fe15d071072e53608bf0b05aace"} err="failed to get container status \"ca0bb5fa4cc56a3e838fdbba10d19b7e27450fe15d071072e53608bf0b05aace\": rpc error: code = NotFound desc = could not 
find container \"ca0bb5fa4cc56a3e838fdbba10d19b7e27450fe15d071072e53608bf0b05aace\": container with ID starting with ca0bb5fa4cc56a3e838fdbba10d19b7e27450fe15d071072e53608bf0b05aace not found: ID does not exist" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.603029 4749 scope.go:117] "RemoveContainer" containerID="f66626c907bed75473591c4e6e56e70f36062ef7138f333baa86166f5e0eb443" Mar 10 17:34:53 crc kubenswrapper[4749]: E0310 17:34:53.603480 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f66626c907bed75473591c4e6e56e70f36062ef7138f333baa86166f5e0eb443\": container with ID starting with f66626c907bed75473591c4e6e56e70f36062ef7138f333baa86166f5e0eb443 not found: ID does not exist" containerID="f66626c907bed75473591c4e6e56e70f36062ef7138f333baa86166f5e0eb443" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.603543 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f66626c907bed75473591c4e6e56e70f36062ef7138f333baa86166f5e0eb443"} err="failed to get container status \"f66626c907bed75473591c4e6e56e70f36062ef7138f333baa86166f5e0eb443\": rpc error: code = NotFound desc = could not find container \"f66626c907bed75473591c4e6e56e70f36062ef7138f333baa86166f5e0eb443\": container with ID starting with f66626c907bed75473591c4e6e56e70f36062ef7138f333baa86166f5e0eb443 not found: ID does not exist" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.603585 4749 scope.go:117] "RemoveContainer" containerID="257f1baf1f465f9e7a8ce64779ad383ccd31de44a4a8a52245c5a062d77a8f1d" Mar 10 17:34:53 crc kubenswrapper[4749]: E0310 17:34:53.603948 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"257f1baf1f465f9e7a8ce64779ad383ccd31de44a4a8a52245c5a062d77a8f1d\": container with ID starting with 257f1baf1f465f9e7a8ce64779ad383ccd31de44a4a8a52245c5a062d77a8f1d not found: ID 
does not exist" containerID="257f1baf1f465f9e7a8ce64779ad383ccd31de44a4a8a52245c5a062d77a8f1d" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.603979 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"257f1baf1f465f9e7a8ce64779ad383ccd31de44a4a8a52245c5a062d77a8f1d"} err="failed to get container status \"257f1baf1f465f9e7a8ce64779ad383ccd31de44a4a8a52245c5a062d77a8f1d\": rpc error: code = NotFound desc = could not find container \"257f1baf1f465f9e7a8ce64779ad383ccd31de44a4a8a52245c5a062d77a8f1d\": container with ID starting with 257f1baf1f465f9e7a8ce64779ad383ccd31de44a4a8a52245c5a062d77a8f1d not found: ID does not exist" Mar 10 17:34:53 crc kubenswrapper[4749]: I0310 17:34:53.616493 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" path="/var/lib/kubelet/pods/bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5/volumes" Mar 10 17:34:55 crc kubenswrapper[4749]: I0310 17:34:55.294666 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wfvxt"] Mar 10 17:34:55 crc kubenswrapper[4749]: I0310 17:34:55.525848 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wfvxt" podUID="edd11bf6-6fc9-477b-a934-06359e31ca54" containerName="registry-server" containerID="cri-o://1360d10d5c3932e014e6fc3bd9c48cbacc73f9b5b92d882027f77ce82262f8d3" gracePeriod=2 Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.003525 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.086127 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd11bf6-6fc9-477b-a934-06359e31ca54-utilities\") pod \"edd11bf6-6fc9-477b-a934-06359e31ca54\" (UID: \"edd11bf6-6fc9-477b-a934-06359e31ca54\") " Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.086334 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd11bf6-6fc9-477b-a934-06359e31ca54-catalog-content\") pod \"edd11bf6-6fc9-477b-a934-06359e31ca54\" (UID: \"edd11bf6-6fc9-477b-a934-06359e31ca54\") " Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.086366 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpzjm\" (UniqueName: \"kubernetes.io/projected/edd11bf6-6fc9-477b-a934-06359e31ca54-kube-api-access-rpzjm\") pod \"edd11bf6-6fc9-477b-a934-06359e31ca54\" (UID: \"edd11bf6-6fc9-477b-a934-06359e31ca54\") " Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.091505 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd11bf6-6fc9-477b-a934-06359e31ca54-utilities" (OuterVolumeSpecName: "utilities") pod "edd11bf6-6fc9-477b-a934-06359e31ca54" (UID: "edd11bf6-6fc9-477b-a934-06359e31ca54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.094775 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd11bf6-6fc9-477b-a934-06359e31ca54-kube-api-access-rpzjm" (OuterVolumeSpecName: "kube-api-access-rpzjm") pod "edd11bf6-6fc9-477b-a934-06359e31ca54" (UID: "edd11bf6-6fc9-477b-a934-06359e31ca54"). InnerVolumeSpecName "kube-api-access-rpzjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.188716 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpzjm\" (UniqueName: \"kubernetes.io/projected/edd11bf6-6fc9-477b-a934-06359e31ca54-kube-api-access-rpzjm\") on node \"crc\" DevicePath \"\"" Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.189035 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd11bf6-6fc9-477b-a934-06359e31ca54-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.534929 4749 generic.go:334] "Generic (PLEG): container finished" podID="edd11bf6-6fc9-477b-a934-06359e31ca54" containerID="1360d10d5c3932e014e6fc3bd9c48cbacc73f9b5b92d882027f77ce82262f8d3" exitCode=0 Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.534976 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wfvxt" Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.534972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wfvxt" event={"ID":"edd11bf6-6fc9-477b-a934-06359e31ca54","Type":"ContainerDied","Data":"1360d10d5c3932e014e6fc3bd9c48cbacc73f9b5b92d882027f77ce82262f8d3"} Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.536212 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wfvxt" event={"ID":"edd11bf6-6fc9-477b-a934-06359e31ca54","Type":"ContainerDied","Data":"4d4822672da0795c5ba126e77b2552c619f2ba5f502f89eebe04dd207e802687"} Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.536245 4749 scope.go:117] "RemoveContainer" containerID="1360d10d5c3932e014e6fc3bd9c48cbacc73f9b5b92d882027f77ce82262f8d3" Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.553260 4749 scope.go:117] "RemoveContainer" 
containerID="dc7c151c1975389b51c6e9ee7ab9089eb774be7579cfa0e25c520696837baac5" Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.584188 4749 scope.go:117] "RemoveContainer" containerID="1e8de540184c3d5af0b93af932bd9fe05cedf505f654e10e564710ae9da34281" Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.610652 4749 scope.go:117] "RemoveContainer" containerID="1360d10d5c3932e014e6fc3bd9c48cbacc73f9b5b92d882027f77ce82262f8d3" Mar 10 17:34:56 crc kubenswrapper[4749]: E0310 17:34:56.611087 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1360d10d5c3932e014e6fc3bd9c48cbacc73f9b5b92d882027f77ce82262f8d3\": container with ID starting with 1360d10d5c3932e014e6fc3bd9c48cbacc73f9b5b92d882027f77ce82262f8d3 not found: ID does not exist" containerID="1360d10d5c3932e014e6fc3bd9c48cbacc73f9b5b92d882027f77ce82262f8d3" Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.611130 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1360d10d5c3932e014e6fc3bd9c48cbacc73f9b5b92d882027f77ce82262f8d3"} err="failed to get container status \"1360d10d5c3932e014e6fc3bd9c48cbacc73f9b5b92d882027f77ce82262f8d3\": rpc error: code = NotFound desc = could not find container \"1360d10d5c3932e014e6fc3bd9c48cbacc73f9b5b92d882027f77ce82262f8d3\": container with ID starting with 1360d10d5c3932e014e6fc3bd9c48cbacc73f9b5b92d882027f77ce82262f8d3 not found: ID does not exist" Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.611149 4749 scope.go:117] "RemoveContainer" containerID="dc7c151c1975389b51c6e9ee7ab9089eb774be7579cfa0e25c520696837baac5" Mar 10 17:34:56 crc kubenswrapper[4749]: E0310 17:34:56.611544 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc7c151c1975389b51c6e9ee7ab9089eb774be7579cfa0e25c520696837baac5\": container with ID starting with 
dc7c151c1975389b51c6e9ee7ab9089eb774be7579cfa0e25c520696837baac5 not found: ID does not exist" containerID="dc7c151c1975389b51c6e9ee7ab9089eb774be7579cfa0e25c520696837baac5" Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.611570 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc7c151c1975389b51c6e9ee7ab9089eb774be7579cfa0e25c520696837baac5"} err="failed to get container status \"dc7c151c1975389b51c6e9ee7ab9089eb774be7579cfa0e25c520696837baac5\": rpc error: code = NotFound desc = could not find container \"dc7c151c1975389b51c6e9ee7ab9089eb774be7579cfa0e25c520696837baac5\": container with ID starting with dc7c151c1975389b51c6e9ee7ab9089eb774be7579cfa0e25c520696837baac5 not found: ID does not exist" Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.611584 4749 scope.go:117] "RemoveContainer" containerID="1e8de540184c3d5af0b93af932bd9fe05cedf505f654e10e564710ae9da34281" Mar 10 17:34:56 crc kubenswrapper[4749]: E0310 17:34:56.611911 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e8de540184c3d5af0b93af932bd9fe05cedf505f654e10e564710ae9da34281\": container with ID starting with 1e8de540184c3d5af0b93af932bd9fe05cedf505f654e10e564710ae9da34281 not found: ID does not exist" containerID="1e8de540184c3d5af0b93af932bd9fe05cedf505f654e10e564710ae9da34281" Mar 10 17:34:56 crc kubenswrapper[4749]: I0310 17:34:56.611933 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8de540184c3d5af0b93af932bd9fe05cedf505f654e10e564710ae9da34281"} err="failed to get container status \"1e8de540184c3d5af0b93af932bd9fe05cedf505f654e10e564710ae9da34281\": rpc error: code = NotFound desc = could not find container \"1e8de540184c3d5af0b93af932bd9fe05cedf505f654e10e564710ae9da34281\": container with ID starting with 1e8de540184c3d5af0b93af932bd9fe05cedf505f654e10e564710ae9da34281 not found: ID does not 
exist" Mar 10 17:34:57 crc kubenswrapper[4749]: I0310 17:34:57.041058 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd11bf6-6fc9-477b-a934-06359e31ca54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edd11bf6-6fc9-477b-a934-06359e31ca54" (UID: "edd11bf6-6fc9-477b-a934-06359e31ca54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:34:57 crc kubenswrapper[4749]: I0310 17:34:57.103973 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd11bf6-6fc9-477b-a934-06359e31ca54-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 17:34:57 crc kubenswrapper[4749]: I0310 17:34:57.171429 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wfvxt"] Mar 10 17:34:57 crc kubenswrapper[4749]: I0310 17:34:57.180228 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wfvxt"] Mar 10 17:34:57 crc kubenswrapper[4749]: I0310 17:34:57.615167 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edd11bf6-6fc9-477b-a934-06359e31ca54" path="/var/lib/kubelet/pods/edd11bf6-6fc9-477b-a934-06359e31ca54/volumes" Mar 10 17:35:00 crc kubenswrapper[4749]: I0310 17:35:00.607032 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87" Mar 10 17:35:01 crc kubenswrapper[4749]: I0310 17:35:01.575322 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"1aac784073c11425463c0895d5eaa15a3a702ef7c8a1bfa3648c827673147728"} Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.153465 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29552736-5rfjr"] Mar 10 17:36:00 crc kubenswrapper[4749]: E0310 17:36:00.154969 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd11bf6-6fc9-477b-a934-06359e31ca54" containerName="extract-utilities" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.154991 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd11bf6-6fc9-477b-a934-06359e31ca54" containerName="extract-utilities" Mar 10 17:36:00 crc kubenswrapper[4749]: E0310 17:36:00.155003 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" containerName="registry-server" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.155012 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" containerName="registry-server" Mar 10 17:36:00 crc kubenswrapper[4749]: E0310 17:36:00.155032 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" containerName="extract-content" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.155039 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" containerName="extract-content" Mar 10 17:36:00 crc kubenswrapper[4749]: E0310 17:36:00.155055 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd11bf6-6fc9-477b-a934-06359e31ca54" containerName="extract-content" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.155061 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd11bf6-6fc9-477b-a934-06359e31ca54" containerName="extract-content" Mar 10 17:36:00 crc kubenswrapper[4749]: E0310 17:36:00.155079 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" containerName="extract-utilities" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.155086 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" containerName="extract-utilities" Mar 10 17:36:00 crc kubenswrapper[4749]: E0310 17:36:00.155095 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd11bf6-6fc9-477b-a934-06359e31ca54" containerName="registry-server" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.155102 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd11bf6-6fc9-477b-a934-06359e31ca54" containerName="registry-server" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.155283 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="edd11bf6-6fc9-477b-a934-06359e31ca54" containerName="registry-server" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.155307 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcaa1bc9-13c0-4ef0-a050-5c6e96144bb5" containerName="registry-server" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.156138 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552736-5rfjr" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.168529 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.170684 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.172212 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.221618 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552736-5rfjr"] Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.304354 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztplg\" (UniqueName: 
\"kubernetes.io/projected/f21f95d9-d2ee-446b-909b-ec2496dd4061-kube-api-access-ztplg\") pod \"auto-csr-approver-29552736-5rfjr\" (UID: \"f21f95d9-d2ee-446b-909b-ec2496dd4061\") " pod="openshift-infra/auto-csr-approver-29552736-5rfjr" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.406456 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztplg\" (UniqueName: \"kubernetes.io/projected/f21f95d9-d2ee-446b-909b-ec2496dd4061-kube-api-access-ztplg\") pod \"auto-csr-approver-29552736-5rfjr\" (UID: \"f21f95d9-d2ee-446b-909b-ec2496dd4061\") " pod="openshift-infra/auto-csr-approver-29552736-5rfjr" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.431036 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztplg\" (UniqueName: \"kubernetes.io/projected/f21f95d9-d2ee-446b-909b-ec2496dd4061-kube-api-access-ztplg\") pod \"auto-csr-approver-29552736-5rfjr\" (UID: \"f21f95d9-d2ee-446b-909b-ec2496dd4061\") " pod="openshift-infra/auto-csr-approver-29552736-5rfjr" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.505753 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552736-5rfjr" Mar 10 17:36:00 crc kubenswrapper[4749]: I0310 17:36:00.955054 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552736-5rfjr"] Mar 10 17:36:01 crc kubenswrapper[4749]: I0310 17:36:01.021681 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552736-5rfjr" event={"ID":"f21f95d9-d2ee-446b-909b-ec2496dd4061","Type":"ContainerStarted","Data":"12cf150f3b52374d9036acb25f1285dc5dbe80c2a2ad2564ffba21caa92cbd32"} Mar 10 17:36:03 crc kubenswrapper[4749]: I0310 17:36:03.039485 4749 generic.go:334] "Generic (PLEG): container finished" podID="f21f95d9-d2ee-446b-909b-ec2496dd4061" containerID="5f2a3259ac821d9278d8aece0fefa9e5efda096d38a63b092a3efd229ea26840" exitCode=0 Mar 10 17:36:03 crc kubenswrapper[4749]: I0310 17:36:03.039620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552736-5rfjr" event={"ID":"f21f95d9-d2ee-446b-909b-ec2496dd4061","Type":"ContainerDied","Data":"5f2a3259ac821d9278d8aece0fefa9e5efda096d38a63b092a3efd229ea26840"} Mar 10 17:36:04 crc kubenswrapper[4749]: I0310 17:36:04.401458 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552736-5rfjr" Mar 10 17:36:04 crc kubenswrapper[4749]: I0310 17:36:04.482931 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztplg\" (UniqueName: \"kubernetes.io/projected/f21f95d9-d2ee-446b-909b-ec2496dd4061-kube-api-access-ztplg\") pod \"f21f95d9-d2ee-446b-909b-ec2496dd4061\" (UID: \"f21f95d9-d2ee-446b-909b-ec2496dd4061\") " Mar 10 17:36:04 crc kubenswrapper[4749]: I0310 17:36:04.489536 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21f95d9-d2ee-446b-909b-ec2496dd4061-kube-api-access-ztplg" (OuterVolumeSpecName: "kube-api-access-ztplg") pod "f21f95d9-d2ee-446b-909b-ec2496dd4061" (UID: "f21f95d9-d2ee-446b-909b-ec2496dd4061"). InnerVolumeSpecName "kube-api-access-ztplg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:36:04 crc kubenswrapper[4749]: I0310 17:36:04.585959 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztplg\" (UniqueName: \"kubernetes.io/projected/f21f95d9-d2ee-446b-909b-ec2496dd4061-kube-api-access-ztplg\") on node \"crc\" DevicePath \"\"" Mar 10 17:36:05 crc kubenswrapper[4749]: I0310 17:36:05.057453 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552736-5rfjr" event={"ID":"f21f95d9-d2ee-446b-909b-ec2496dd4061","Type":"ContainerDied","Data":"12cf150f3b52374d9036acb25f1285dc5dbe80c2a2ad2564ffba21caa92cbd32"} Mar 10 17:36:05 crc kubenswrapper[4749]: I0310 17:36:05.057511 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12cf150f3b52374d9036acb25f1285dc5dbe80c2a2ad2564ffba21caa92cbd32" Mar 10 17:36:05 crc kubenswrapper[4749]: I0310 17:36:05.057537 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552736-5rfjr" Mar 10 17:36:05 crc kubenswrapper[4749]: I0310 17:36:05.471874 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552730-rj2sv"] Mar 10 17:36:05 crc kubenswrapper[4749]: I0310 17:36:05.478364 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552730-rj2sv"] Mar 10 17:36:05 crc kubenswrapper[4749]: I0310 17:36:05.618355 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14988633-5222-4217-9bc5-21260023616c" path="/var/lib/kubelet/pods/14988633-5222-4217-9bc5-21260023616c/volumes" Mar 10 17:36:28 crc kubenswrapper[4749]: I0310 17:36:28.931929 4749 scope.go:117] "RemoveContainer" containerID="c123746721d70ea9b28a0786d86bf7608ed262a00d37e1ca22736ea7f76c3a6f" Mar 10 17:37:20 crc kubenswrapper[4749]: I0310 17:37:20.980491 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:37:20 crc kubenswrapper[4749]: I0310 17:37:20.981072 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:37:50 crc kubenswrapper[4749]: I0310 17:37:50.980990 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:37:50 crc kubenswrapper[4749]: 
I0310 17:37:50.981893 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:37:54 crc kubenswrapper[4749]: I0310 17:37:54.892619 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8xnqc/must-gather-2d6ml"] Mar 10 17:37:54 crc kubenswrapper[4749]: E0310 17:37:54.899877 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21f95d9-d2ee-446b-909b-ec2496dd4061" containerName="oc" Mar 10 17:37:54 crc kubenswrapper[4749]: I0310 17:37:54.899906 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21f95d9-d2ee-446b-909b-ec2496dd4061" containerName="oc" Mar 10 17:37:54 crc kubenswrapper[4749]: I0310 17:37:54.900115 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f21f95d9-d2ee-446b-909b-ec2496dd4061" containerName="oc" Mar 10 17:37:54 crc kubenswrapper[4749]: I0310 17:37:54.904809 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8xnqc/must-gather-2d6ml" Mar 10 17:37:54 crc kubenswrapper[4749]: I0310 17:37:54.906586 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8xnqc/must-gather-2d6ml"] Mar 10 17:37:54 crc kubenswrapper[4749]: I0310 17:37:54.910159 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8xnqc"/"kube-root-ca.crt" Mar 10 17:37:54 crc kubenswrapper[4749]: I0310 17:37:54.910427 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8xnqc"/"openshift-service-ca.crt" Mar 10 17:37:55 crc kubenswrapper[4749]: I0310 17:37:55.004959 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1f80f845-a6b7-4b34-885c-dc2c3773a9d5-must-gather-output\") pod \"must-gather-2d6ml\" (UID: \"1f80f845-a6b7-4b34-885c-dc2c3773a9d5\") " pod="openshift-must-gather-8xnqc/must-gather-2d6ml" Mar 10 17:37:55 crc kubenswrapper[4749]: I0310 17:37:55.005042 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt2mv\" (UniqueName: \"kubernetes.io/projected/1f80f845-a6b7-4b34-885c-dc2c3773a9d5-kube-api-access-dt2mv\") pod \"must-gather-2d6ml\" (UID: \"1f80f845-a6b7-4b34-885c-dc2c3773a9d5\") " pod="openshift-must-gather-8xnqc/must-gather-2d6ml" Mar 10 17:37:55 crc kubenswrapper[4749]: I0310 17:37:55.106044 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1f80f845-a6b7-4b34-885c-dc2c3773a9d5-must-gather-output\") pod \"must-gather-2d6ml\" (UID: \"1f80f845-a6b7-4b34-885c-dc2c3773a9d5\") " pod="openshift-must-gather-8xnqc/must-gather-2d6ml" Mar 10 17:37:55 crc kubenswrapper[4749]: I0310 17:37:55.106121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dt2mv\" (UniqueName: \"kubernetes.io/projected/1f80f845-a6b7-4b34-885c-dc2c3773a9d5-kube-api-access-dt2mv\") pod \"must-gather-2d6ml\" (UID: \"1f80f845-a6b7-4b34-885c-dc2c3773a9d5\") " pod="openshift-must-gather-8xnqc/must-gather-2d6ml" Mar 10 17:37:55 crc kubenswrapper[4749]: I0310 17:37:55.106669 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1f80f845-a6b7-4b34-885c-dc2c3773a9d5-must-gather-output\") pod \"must-gather-2d6ml\" (UID: \"1f80f845-a6b7-4b34-885c-dc2c3773a9d5\") " pod="openshift-must-gather-8xnqc/must-gather-2d6ml" Mar 10 17:37:55 crc kubenswrapper[4749]: I0310 17:37:55.126017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt2mv\" (UniqueName: \"kubernetes.io/projected/1f80f845-a6b7-4b34-885c-dc2c3773a9d5-kube-api-access-dt2mv\") pod \"must-gather-2d6ml\" (UID: \"1f80f845-a6b7-4b34-885c-dc2c3773a9d5\") " pod="openshift-must-gather-8xnqc/must-gather-2d6ml" Mar 10 17:37:55 crc kubenswrapper[4749]: I0310 17:37:55.228611 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xnqc/must-gather-2d6ml" Mar 10 17:37:55 crc kubenswrapper[4749]: I0310 17:37:55.702706 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8xnqc/must-gather-2d6ml"] Mar 10 17:37:56 crc kubenswrapper[4749]: I0310 17:37:56.108831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xnqc/must-gather-2d6ml" event={"ID":"1f80f845-a6b7-4b34-885c-dc2c3773a9d5","Type":"ContainerStarted","Data":"1dc5bba3083766d6b13c3bbcda8a9683854451656c6a173a450382af53a18580"} Mar 10 17:38:00 crc kubenswrapper[4749]: I0310 17:38:00.148783 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552738-rzd9h"] Mar 10 17:38:00 crc kubenswrapper[4749]: I0310 17:38:00.150424 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552738-rzd9h" Mar 10 17:38:00 crc kubenswrapper[4749]: I0310 17:38:00.153971 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:38:00 crc kubenswrapper[4749]: I0310 17:38:00.154130 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:38:00 crc kubenswrapper[4749]: I0310 17:38:00.154320 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:38:00 crc kubenswrapper[4749]: I0310 17:38:00.166821 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552738-rzd9h"] Mar 10 17:38:00 crc kubenswrapper[4749]: I0310 17:38:00.204502 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l67k\" (UniqueName: \"kubernetes.io/projected/f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3-kube-api-access-9l67k\") pod \"auto-csr-approver-29552738-rzd9h\" (UID: \"f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3\") " pod="openshift-infra/auto-csr-approver-29552738-rzd9h" Mar 10 17:38:00 crc kubenswrapper[4749]: I0310 17:38:00.305888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l67k\" (UniqueName: \"kubernetes.io/projected/f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3-kube-api-access-9l67k\") pod \"auto-csr-approver-29552738-rzd9h\" (UID: \"f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3\") " pod="openshift-infra/auto-csr-approver-29552738-rzd9h" Mar 10 17:38:00 crc kubenswrapper[4749]: I0310 17:38:00.325171 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l67k\" (UniqueName: \"kubernetes.io/projected/f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3-kube-api-access-9l67k\") pod \"auto-csr-approver-29552738-rzd9h\" (UID: \"f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3\") " 
pod="openshift-infra/auto-csr-approver-29552738-rzd9h" Mar 10 17:38:00 crc kubenswrapper[4749]: I0310 17:38:00.486286 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552738-rzd9h" Mar 10 17:38:02 crc kubenswrapper[4749]: I0310 17:38:02.461538 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552738-rzd9h"] Mar 10 17:38:03 crc kubenswrapper[4749]: I0310 17:38:03.175290 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552738-rzd9h" event={"ID":"f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3","Type":"ContainerStarted","Data":"7f245d1bf17cdce7390190f23baa0e28c13f7d1a09d78b8acc7042082703cfd6"} Mar 10 17:38:03 crc kubenswrapper[4749]: I0310 17:38:03.177323 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xnqc/must-gather-2d6ml" event={"ID":"1f80f845-a6b7-4b34-885c-dc2c3773a9d5","Type":"ContainerStarted","Data":"ae391d13ce1aac252844a3ecbc795cf5f6babd5082dead627e77031ff718916a"} Mar 10 17:38:03 crc kubenswrapper[4749]: I0310 17:38:03.177368 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xnqc/must-gather-2d6ml" event={"ID":"1f80f845-a6b7-4b34-885c-dc2c3773a9d5","Type":"ContainerStarted","Data":"616fb403d48848f77d4b05d0184f475cbb65bee095ce7ae18a3204406703b5f2"} Mar 10 17:38:03 crc kubenswrapper[4749]: I0310 17:38:03.197068 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8xnqc/must-gather-2d6ml" podStartSLOduration=2.8348834309999997 podStartE2EDuration="9.197047947s" podCreationTimestamp="2026-03-10 17:37:54 +0000 UTC" firstStartedPulling="2026-03-10 17:37:55.71271953 +0000 UTC m=+6572.834585227" lastFinishedPulling="2026-03-10 17:38:02.074884056 +0000 UTC m=+6579.196749743" observedRunningTime="2026-03-10 17:38:03.19198391 +0000 UTC m=+6580.313849617" watchObservedRunningTime="2026-03-10 
17:38:03.197047947 +0000 UTC m=+6580.318913654" Mar 10 17:38:04 crc kubenswrapper[4749]: I0310 17:38:04.186894 4749 generic.go:334] "Generic (PLEG): container finished" podID="f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3" containerID="0e2f7e551cb4fd87b6e82feb2ecddabf8d9e9e09363507f3828f01a2d60aa4bf" exitCode=0 Mar 10 17:38:04 crc kubenswrapper[4749]: I0310 17:38:04.186973 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552738-rzd9h" event={"ID":"f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3","Type":"ContainerDied","Data":"0e2f7e551cb4fd87b6e82feb2ecddabf8d9e9e09363507f3828f01a2d60aa4bf"} Mar 10 17:38:05 crc kubenswrapper[4749]: I0310 17:38:05.045213 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8xnqc/crc-debug-4lg52"] Mar 10 17:38:05 crc kubenswrapper[4749]: I0310 17:38:05.047319 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xnqc/crc-debug-4lg52" Mar 10 17:38:05 crc kubenswrapper[4749]: I0310 17:38:05.050859 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8xnqc"/"default-dockercfg-t7bj6" Mar 10 17:38:05 crc kubenswrapper[4749]: I0310 17:38:05.103805 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99b93363-09a7-4267-9725-684c2c1d27a0-host\") pod \"crc-debug-4lg52\" (UID: \"99b93363-09a7-4267-9725-684c2c1d27a0\") " pod="openshift-must-gather-8xnqc/crc-debug-4lg52" Mar 10 17:38:05 crc kubenswrapper[4749]: I0310 17:38:05.103916 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gth6f\" (UniqueName: \"kubernetes.io/projected/99b93363-09a7-4267-9725-684c2c1d27a0-kube-api-access-gth6f\") pod \"crc-debug-4lg52\" (UID: \"99b93363-09a7-4267-9725-684c2c1d27a0\") " pod="openshift-must-gather-8xnqc/crc-debug-4lg52" Mar 10 17:38:05 crc 
kubenswrapper[4749]: I0310 17:38:05.205928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99b93363-09a7-4267-9725-684c2c1d27a0-host\") pod \"crc-debug-4lg52\" (UID: \"99b93363-09a7-4267-9725-684c2c1d27a0\") " pod="openshift-must-gather-8xnqc/crc-debug-4lg52" Mar 10 17:38:05 crc kubenswrapper[4749]: I0310 17:38:05.206005 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gth6f\" (UniqueName: \"kubernetes.io/projected/99b93363-09a7-4267-9725-684c2c1d27a0-kube-api-access-gth6f\") pod \"crc-debug-4lg52\" (UID: \"99b93363-09a7-4267-9725-684c2c1d27a0\") " pod="openshift-must-gather-8xnqc/crc-debug-4lg52" Mar 10 17:38:05 crc kubenswrapper[4749]: I0310 17:38:05.206051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99b93363-09a7-4267-9725-684c2c1d27a0-host\") pod \"crc-debug-4lg52\" (UID: \"99b93363-09a7-4267-9725-684c2c1d27a0\") " pod="openshift-must-gather-8xnqc/crc-debug-4lg52" Mar 10 17:38:05 crc kubenswrapper[4749]: I0310 17:38:05.224736 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gth6f\" (UniqueName: \"kubernetes.io/projected/99b93363-09a7-4267-9725-684c2c1d27a0-kube-api-access-gth6f\") pod \"crc-debug-4lg52\" (UID: \"99b93363-09a7-4267-9725-684c2c1d27a0\") " pod="openshift-must-gather-8xnqc/crc-debug-4lg52" Mar 10 17:38:05 crc kubenswrapper[4749]: I0310 17:38:05.366887 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8xnqc/crc-debug-4lg52" Mar 10 17:38:05 crc kubenswrapper[4749]: W0310 17:38:05.394140 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99b93363_09a7_4267_9725_684c2c1d27a0.slice/crio-e3ccd67dfc28b4715636c5a7a66946ccf77ac86555882014bd9df5706a315d17 WatchSource:0}: Error finding container e3ccd67dfc28b4715636c5a7a66946ccf77ac86555882014bd9df5706a315d17: Status 404 returned error can't find the container with id e3ccd67dfc28b4715636c5a7a66946ccf77ac86555882014bd9df5706a315d17 Mar 10 17:38:05 crc kubenswrapper[4749]: I0310 17:38:05.479962 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552738-rzd9h" Mar 10 17:38:05 crc kubenswrapper[4749]: I0310 17:38:05.511304 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l67k\" (UniqueName: \"kubernetes.io/projected/f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3-kube-api-access-9l67k\") pod \"f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3\" (UID: \"f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3\") " Mar 10 17:38:05 crc kubenswrapper[4749]: I0310 17:38:05.515266 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3-kube-api-access-9l67k" (OuterVolumeSpecName: "kube-api-access-9l67k") pod "f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3" (UID: "f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3"). InnerVolumeSpecName "kube-api-access-9l67k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:38:05 crc kubenswrapper[4749]: I0310 17:38:05.613947 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l67k\" (UniqueName: \"kubernetes.io/projected/f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3-kube-api-access-9l67k\") on node \"crc\" DevicePath \"\"" Mar 10 17:38:06 crc kubenswrapper[4749]: I0310 17:38:06.217423 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552738-rzd9h" event={"ID":"f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3","Type":"ContainerDied","Data":"7f245d1bf17cdce7390190f23baa0e28c13f7d1a09d78b8acc7042082703cfd6"} Mar 10 17:38:06 crc kubenswrapper[4749]: I0310 17:38:06.217466 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f245d1bf17cdce7390190f23baa0e28c13f7d1a09d78b8acc7042082703cfd6" Mar 10 17:38:06 crc kubenswrapper[4749]: I0310 17:38:06.217503 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552738-rzd9h" Mar 10 17:38:06 crc kubenswrapper[4749]: I0310 17:38:06.219892 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xnqc/crc-debug-4lg52" event={"ID":"99b93363-09a7-4267-9725-684c2c1d27a0","Type":"ContainerStarted","Data":"e3ccd67dfc28b4715636c5a7a66946ccf77ac86555882014bd9df5706a315d17"} Mar 10 17:38:06 crc kubenswrapper[4749]: I0310 17:38:06.552477 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552732-2mt6x"] Mar 10 17:38:06 crc kubenswrapper[4749]: I0310 17:38:06.558586 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552732-2mt6x"] Mar 10 17:38:07 crc kubenswrapper[4749]: I0310 17:38:07.618826 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41a3a4c1-f676-48c1-8613-6ea3a6b8be8d" path="/var/lib/kubelet/pods/41a3a4c1-f676-48c1-8613-6ea3a6b8be8d/volumes" Mar 
10 17:38:17 crc kubenswrapper[4749]: I0310 17:38:17.322093 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xnqc/crc-debug-4lg52" event={"ID":"99b93363-09a7-4267-9725-684c2c1d27a0","Type":"ContainerStarted","Data":"c5b3d4d55948c230771178c219577832502b81089da3d4799fe7992bd41d16a1"} Mar 10 17:38:17 crc kubenswrapper[4749]: I0310 17:38:17.344724 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8xnqc/crc-debug-4lg52" podStartSLOduration=1.114851724 podStartE2EDuration="12.344692845s" podCreationTimestamp="2026-03-10 17:38:05 +0000 UTC" firstStartedPulling="2026-03-10 17:38:05.396764681 +0000 UTC m=+6582.518630368" lastFinishedPulling="2026-03-10 17:38:16.626605802 +0000 UTC m=+6593.748471489" observedRunningTime="2026-03-10 17:38:17.34042122 +0000 UTC m=+6594.462286917" watchObservedRunningTime="2026-03-10 17:38:17.344692845 +0000 UTC m=+6594.466558572" Mar 10 17:38:20 crc kubenswrapper[4749]: I0310 17:38:20.980967 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:38:20 crc kubenswrapper[4749]: I0310 17:38:20.981401 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:38:20 crc kubenswrapper[4749]: I0310 17:38:20.981444 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 17:38:20 crc kubenswrapper[4749]: I0310 17:38:20.981951 4749 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1aac784073c11425463c0895d5eaa15a3a702ef7c8a1bfa3648c827673147728"} pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 17:38:20 crc kubenswrapper[4749]: I0310 17:38:20.982005 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://1aac784073c11425463c0895d5eaa15a3a702ef7c8a1bfa3648c827673147728" gracePeriod=600 Mar 10 17:38:21 crc kubenswrapper[4749]: I0310 17:38:21.354511 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="1aac784073c11425463c0895d5eaa15a3a702ef7c8a1bfa3648c827673147728" exitCode=0 Mar 10 17:38:21 crc kubenswrapper[4749]: I0310 17:38:21.354560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"1aac784073c11425463c0895d5eaa15a3a702ef7c8a1bfa3648c827673147728"} Mar 10 17:38:21 crc kubenswrapper[4749]: I0310 17:38:21.354631 4749 scope.go:117] "RemoveContainer" containerID="65f50bda09164b450cc9c86309f0e9672f8410663d40eb80c816e00eac8bda87" Mar 10 17:38:23 crc kubenswrapper[4749]: I0310 17:38:23.373941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerStarted","Data":"5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a"} Mar 10 17:38:29 crc kubenswrapper[4749]: I0310 17:38:29.026041 4749 scope.go:117] "RemoveContainer" 
containerID="79c8c4b936467926790020d942e1fd1a67001a256d9b78c5fd7a4762b01ff7de" Mar 10 17:38:33 crc kubenswrapper[4749]: I0310 17:38:33.480281 4749 generic.go:334] "Generic (PLEG): container finished" podID="99b93363-09a7-4267-9725-684c2c1d27a0" containerID="c5b3d4d55948c230771178c219577832502b81089da3d4799fe7992bd41d16a1" exitCode=0 Mar 10 17:38:33 crc kubenswrapper[4749]: I0310 17:38:33.480364 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xnqc/crc-debug-4lg52" event={"ID":"99b93363-09a7-4267-9725-684c2c1d27a0","Type":"ContainerDied","Data":"c5b3d4d55948c230771178c219577832502b81089da3d4799fe7992bd41d16a1"} Mar 10 17:38:34 crc kubenswrapper[4749]: I0310 17:38:34.617629 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xnqc/crc-debug-4lg52" Mar 10 17:38:34 crc kubenswrapper[4749]: I0310 17:38:34.649885 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8xnqc/crc-debug-4lg52"] Mar 10 17:38:34 crc kubenswrapper[4749]: I0310 17:38:34.656994 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8xnqc/crc-debug-4lg52"] Mar 10 17:38:34 crc kubenswrapper[4749]: I0310 17:38:34.709246 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gth6f\" (UniqueName: \"kubernetes.io/projected/99b93363-09a7-4267-9725-684c2c1d27a0-kube-api-access-gth6f\") pod \"99b93363-09a7-4267-9725-684c2c1d27a0\" (UID: \"99b93363-09a7-4267-9725-684c2c1d27a0\") " Mar 10 17:38:34 crc kubenswrapper[4749]: I0310 17:38:34.709355 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99b93363-09a7-4267-9725-684c2c1d27a0-host\") pod \"99b93363-09a7-4267-9725-684c2c1d27a0\" (UID: \"99b93363-09a7-4267-9725-684c2c1d27a0\") " Mar 10 17:38:34 crc kubenswrapper[4749]: I0310 17:38:34.709481 4749 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99b93363-09a7-4267-9725-684c2c1d27a0-host" (OuterVolumeSpecName: "host") pod "99b93363-09a7-4267-9725-684c2c1d27a0" (UID: "99b93363-09a7-4267-9725-684c2c1d27a0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 17:38:34 crc kubenswrapper[4749]: I0310 17:38:34.709862 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99b93363-09a7-4267-9725-684c2c1d27a0-host\") on node \"crc\" DevicePath \"\"" Mar 10 17:38:34 crc kubenswrapper[4749]: I0310 17:38:34.722618 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b93363-09a7-4267-9725-684c2c1d27a0-kube-api-access-gth6f" (OuterVolumeSpecName: "kube-api-access-gth6f") pod "99b93363-09a7-4267-9725-684c2c1d27a0" (UID: "99b93363-09a7-4267-9725-684c2c1d27a0"). InnerVolumeSpecName "kube-api-access-gth6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:38:34 crc kubenswrapper[4749]: I0310 17:38:34.811086 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gth6f\" (UniqueName: \"kubernetes.io/projected/99b93363-09a7-4267-9725-684c2c1d27a0-kube-api-access-gth6f\") on node \"crc\" DevicePath \"\"" Mar 10 17:38:35 crc kubenswrapper[4749]: I0310 17:38:35.497741 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3ccd67dfc28b4715636c5a7a66946ccf77ac86555882014bd9df5706a315d17" Mar 10 17:38:35 crc kubenswrapper[4749]: I0310 17:38:35.497840 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8xnqc/crc-debug-4lg52" Mar 10 17:38:35 crc kubenswrapper[4749]: I0310 17:38:35.615505 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99b93363-09a7-4267-9725-684c2c1d27a0" path="/var/lib/kubelet/pods/99b93363-09a7-4267-9725-684c2c1d27a0/volumes" Mar 10 17:38:35 crc kubenswrapper[4749]: I0310 17:38:35.860443 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8xnqc/crc-debug-vljtk"] Mar 10 17:38:35 crc kubenswrapper[4749]: E0310 17:38:35.861135 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3" containerName="oc" Mar 10 17:38:35 crc kubenswrapper[4749]: I0310 17:38:35.861150 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3" containerName="oc" Mar 10 17:38:35 crc kubenswrapper[4749]: E0310 17:38:35.861176 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b93363-09a7-4267-9725-684c2c1d27a0" containerName="container-00" Mar 10 17:38:35 crc kubenswrapper[4749]: I0310 17:38:35.861184 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b93363-09a7-4267-9725-684c2c1d27a0" containerName="container-00" Mar 10 17:38:35 crc kubenswrapper[4749]: I0310 17:38:35.861408 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="99b93363-09a7-4267-9725-684c2c1d27a0" containerName="container-00" Mar 10 17:38:35 crc kubenswrapper[4749]: I0310 17:38:35.861433 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3" containerName="oc" Mar 10 17:38:35 crc kubenswrapper[4749]: I0310 17:38:35.862128 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8xnqc/crc-debug-vljtk" Mar 10 17:38:35 crc kubenswrapper[4749]: I0310 17:38:35.864260 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8xnqc"/"default-dockercfg-t7bj6" Mar 10 17:38:36 crc kubenswrapper[4749]: I0310 17:38:36.031833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjvhc\" (UniqueName: \"kubernetes.io/projected/d4d8305c-b7c0-40d5-bbd9-b0185317d720-kube-api-access-jjvhc\") pod \"crc-debug-vljtk\" (UID: \"d4d8305c-b7c0-40d5-bbd9-b0185317d720\") " pod="openshift-must-gather-8xnqc/crc-debug-vljtk" Mar 10 17:38:36 crc kubenswrapper[4749]: I0310 17:38:36.032346 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d4d8305c-b7c0-40d5-bbd9-b0185317d720-host\") pod \"crc-debug-vljtk\" (UID: \"d4d8305c-b7c0-40d5-bbd9-b0185317d720\") " pod="openshift-must-gather-8xnqc/crc-debug-vljtk" Mar 10 17:38:36 crc kubenswrapper[4749]: I0310 17:38:36.134872 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d4d8305c-b7c0-40d5-bbd9-b0185317d720-host\") pod \"crc-debug-vljtk\" (UID: \"d4d8305c-b7c0-40d5-bbd9-b0185317d720\") " pod="openshift-must-gather-8xnqc/crc-debug-vljtk" Mar 10 17:38:36 crc kubenswrapper[4749]: I0310 17:38:36.134960 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjvhc\" (UniqueName: \"kubernetes.io/projected/d4d8305c-b7c0-40d5-bbd9-b0185317d720-kube-api-access-jjvhc\") pod \"crc-debug-vljtk\" (UID: \"d4d8305c-b7c0-40d5-bbd9-b0185317d720\") " pod="openshift-must-gather-8xnqc/crc-debug-vljtk" Mar 10 17:38:36 crc kubenswrapper[4749]: I0310 17:38:36.135055 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d4d8305c-b7c0-40d5-bbd9-b0185317d720-host\") pod \"crc-debug-vljtk\" (UID: \"d4d8305c-b7c0-40d5-bbd9-b0185317d720\") " pod="openshift-must-gather-8xnqc/crc-debug-vljtk" Mar 10 17:38:36 crc kubenswrapper[4749]: I0310 17:38:36.163855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjvhc\" (UniqueName: \"kubernetes.io/projected/d4d8305c-b7c0-40d5-bbd9-b0185317d720-kube-api-access-jjvhc\") pod \"crc-debug-vljtk\" (UID: \"d4d8305c-b7c0-40d5-bbd9-b0185317d720\") " pod="openshift-must-gather-8xnqc/crc-debug-vljtk" Mar 10 17:38:36 crc kubenswrapper[4749]: I0310 17:38:36.180358 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xnqc/crc-debug-vljtk" Mar 10 17:38:36 crc kubenswrapper[4749]: I0310 17:38:36.507812 4749 generic.go:334] "Generic (PLEG): container finished" podID="d4d8305c-b7c0-40d5-bbd9-b0185317d720" containerID="a9253cf1471fea1cf521995bc83284e9facac652cd8cbd4c7fbe70f44a5932a6" exitCode=1 Mar 10 17:38:36 crc kubenswrapper[4749]: I0310 17:38:36.507907 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xnqc/crc-debug-vljtk" event={"ID":"d4d8305c-b7c0-40d5-bbd9-b0185317d720","Type":"ContainerDied","Data":"a9253cf1471fea1cf521995bc83284e9facac652cd8cbd4c7fbe70f44a5932a6"} Mar 10 17:38:36 crc kubenswrapper[4749]: I0310 17:38:36.508536 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xnqc/crc-debug-vljtk" event={"ID":"d4d8305c-b7c0-40d5-bbd9-b0185317d720","Type":"ContainerStarted","Data":"71c5ae66dc379d0353345e1c29d5fdf7aa5d7e83986580e16994cafbce5a6647"} Mar 10 17:38:36 crc kubenswrapper[4749]: I0310 17:38:36.555075 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8xnqc/crc-debug-vljtk"] Mar 10 17:38:36 crc kubenswrapper[4749]: I0310 17:38:36.563531 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-8xnqc/crc-debug-vljtk"] Mar 10 17:38:37 crc kubenswrapper[4749]: I0310 17:38:37.623842 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xnqc/crc-debug-vljtk" Mar 10 17:38:37 crc kubenswrapper[4749]: I0310 17:38:37.759704 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjvhc\" (UniqueName: \"kubernetes.io/projected/d4d8305c-b7c0-40d5-bbd9-b0185317d720-kube-api-access-jjvhc\") pod \"d4d8305c-b7c0-40d5-bbd9-b0185317d720\" (UID: \"d4d8305c-b7c0-40d5-bbd9-b0185317d720\") " Mar 10 17:38:37 crc kubenswrapper[4749]: I0310 17:38:37.759804 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d4d8305c-b7c0-40d5-bbd9-b0185317d720-host\") pod \"d4d8305c-b7c0-40d5-bbd9-b0185317d720\" (UID: \"d4d8305c-b7c0-40d5-bbd9-b0185317d720\") " Mar 10 17:38:37 crc kubenswrapper[4749]: I0310 17:38:37.760202 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4d8305c-b7c0-40d5-bbd9-b0185317d720-host" (OuterVolumeSpecName: "host") pod "d4d8305c-b7c0-40d5-bbd9-b0185317d720" (UID: "d4d8305c-b7c0-40d5-bbd9-b0185317d720"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 17:38:37 crc kubenswrapper[4749]: I0310 17:38:37.760579 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d4d8305c-b7c0-40d5-bbd9-b0185317d720-host\") on node \"crc\" DevicePath \"\"" Mar 10 17:38:37 crc kubenswrapper[4749]: I0310 17:38:37.777560 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d8305c-b7c0-40d5-bbd9-b0185317d720-kube-api-access-jjvhc" (OuterVolumeSpecName: "kube-api-access-jjvhc") pod "d4d8305c-b7c0-40d5-bbd9-b0185317d720" (UID: "d4d8305c-b7c0-40d5-bbd9-b0185317d720"). 
InnerVolumeSpecName "kube-api-access-jjvhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:38:37 crc kubenswrapper[4749]: I0310 17:38:37.861899 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjvhc\" (UniqueName: \"kubernetes.io/projected/d4d8305c-b7c0-40d5-bbd9-b0185317d720-kube-api-access-jjvhc\") on node \"crc\" DevicePath \"\"" Mar 10 17:38:38 crc kubenswrapper[4749]: I0310 17:38:38.535880 4749 scope.go:117] "RemoveContainer" containerID="a9253cf1471fea1cf521995bc83284e9facac652cd8cbd4c7fbe70f44a5932a6" Mar 10 17:38:38 crc kubenswrapper[4749]: I0310 17:38:38.535919 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xnqc/crc-debug-vljtk" Mar 10 17:38:39 crc kubenswrapper[4749]: I0310 17:38:39.619764 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d8305c-b7c0-40d5-bbd9-b0185317d720" path="/var/lib/kubelet/pods/d4d8305c-b7c0-40d5-bbd9-b0185317d720/volumes" Mar 10 17:38:56 crc kubenswrapper[4749]: I0310 17:38:56.264310 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66d5956757-8smgt_c80035b9-31e6-4623-9c53-3519f48b937c/init/0.log" Mar 10 17:38:56 crc kubenswrapper[4749]: I0310 17:38:56.433709 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66d5956757-8smgt_c80035b9-31e6-4623-9c53-3519f48b937c/init/0.log" Mar 10 17:38:56 crc kubenswrapper[4749]: I0310 17:38:56.485156 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66d5956757-8smgt_c80035b9-31e6-4623-9c53-3519f48b937c/dnsmasq-dns/0.log" Mar 10 17:38:56 crc kubenswrapper[4749]: I0310 17:38:56.622990 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-75498b5f58-p5jp4_1ed6be0f-634b-450a-b7fb-f2b679793b46/keystone-api/0.log" Mar 10 17:38:56 crc kubenswrapper[4749]: I0310 17:38:56.728141 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_mariadb-copy-data_166b2d31-2084-433f-9e16-2a3d865b687b/adoption/0.log" Mar 10 17:38:57 crc kubenswrapper[4749]: I0310 17:38:57.013202 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_93b8004f-cc41-4d8d-ba10-c18e745159e6/mysql-bootstrap/0.log" Mar 10 17:38:57 crc kubenswrapper[4749]: I0310 17:38:57.219419 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_93b8004f-cc41-4d8d-ba10-c18e745159e6/galera/0.log" Mar 10 17:38:57 crc kubenswrapper[4749]: I0310 17:38:57.227023 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_93b8004f-cc41-4d8d-ba10-c18e745159e6/mysql-bootstrap/0.log" Mar 10 17:38:57 crc kubenswrapper[4749]: I0310 17:38:57.403828 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9e297480-747d-4673-965d-dd06d23c11c1/mysql-bootstrap/0.log" Mar 10 17:38:57 crc kubenswrapper[4749]: I0310 17:38:57.614199 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9e297480-747d-4673-965d-dd06d23c11c1/galera/0.log" Mar 10 17:38:57 crc kubenswrapper[4749]: I0310 17:38:57.650762 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9e297480-747d-4673-965d-dd06d23c11c1/mysql-bootstrap/0.log" Mar 10 17:38:57 crc kubenswrapper[4749]: I0310 17:38:57.900892 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c365107c-5e98-4f1e-abf4-9efe9e71de6c/openstackclient/0.log" Mar 10 17:38:58 crc kubenswrapper[4749]: I0310 17:38:58.014567 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_0be86646-03b3-476c-8c66-e80ffa63fd7f/adoption/0.log" Mar 10 17:38:58 crc kubenswrapper[4749]: I0310 17:38:58.166758 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_9ba7986f-4ceb-48e9-9813-e6e856113e7c/openstack-network-exporter/0.log" Mar 10 17:38:58 crc kubenswrapper[4749]: I0310 17:38:58.197221 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9ba7986f-4ceb-48e9-9813-e6e856113e7c/ovn-northd/0.log" Mar 10 17:38:58 crc kubenswrapper[4749]: I0310 17:38:58.230552 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_097d65c0-6a62-4e8d-8ec6-8734b2fd7e2d/memcached/0.log" Mar 10 17:38:58 crc kubenswrapper[4749]: I0310 17:38:58.374579 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bd6f1622-50c3-4a95-80d9-e833ddc6deba/openstack-network-exporter/0.log" Mar 10 17:38:58 crc kubenswrapper[4749]: I0310 17:38:58.378879 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bd6f1622-50c3-4a95-80d9-e833ddc6deba/ovsdbserver-nb/0.log" Mar 10 17:38:58 crc kubenswrapper[4749]: I0310 17:38:58.519851 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_f932070d-6f40-4017-a1d2-cb205561989e/openstack-network-exporter/0.log" Mar 10 17:38:58 crc kubenswrapper[4749]: I0310 17:38:58.556322 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_f932070d-6f40-4017-a1d2-cb205561989e/ovsdbserver-nb/0.log" Mar 10 17:38:58 crc kubenswrapper[4749]: I0310 17:38:58.670508 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_be6afa7c-b5a9-484f-8f55-705241c391dc/openstack-network-exporter/0.log" Mar 10 17:38:58 crc kubenswrapper[4749]: I0310 17:38:58.709097 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_be6afa7c-b5a9-484f-8f55-705241c391dc/ovsdbserver-nb/0.log" Mar 10 17:38:58 crc kubenswrapper[4749]: I0310 17:38:58.816902 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_9f147f8a-662f-43aa-8698-e98aefaf1f4a/openstack-network-exporter/0.log" Mar 10 17:38:58 crc kubenswrapper[4749]: I0310 17:38:58.878701 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9f147f8a-662f-43aa-8698-e98aefaf1f4a/ovsdbserver-sb/0.log" Mar 10 17:38:58 crc kubenswrapper[4749]: I0310 17:38:58.963117 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_f43a27b8-17bb-4826-9d18-0441ee12086c/openstack-network-exporter/0.log" Mar 10 17:38:59 crc kubenswrapper[4749]: I0310 17:38:59.037299 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_f43a27b8-17bb-4826-9d18-0441ee12086c/ovsdbserver-sb/0.log" Mar 10 17:38:59 crc kubenswrapper[4749]: I0310 17:38:59.157544 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_f556053a-b874-43d4-a6e2-a2640c82a2bb/openstack-network-exporter/0.log" Mar 10 17:38:59 crc kubenswrapper[4749]: I0310 17:38:59.243777 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_f556053a-b874-43d4-a6e2-a2640c82a2bb/ovsdbserver-sb/0.log" Mar 10 17:38:59 crc kubenswrapper[4749]: I0310 17:38:59.332505 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a89a76a3-3810-431e-8061-a35fec1eff52/setup-container/0.log" Mar 10 17:38:59 crc kubenswrapper[4749]: I0310 17:38:59.514858 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a89a76a3-3810-431e-8061-a35fec1eff52/setup-container/0.log" Mar 10 17:38:59 crc kubenswrapper[4749]: I0310 17:38:59.530737 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a89a76a3-3810-431e-8061-a35fec1eff52/rabbitmq/0.log" Mar 10 17:38:59 crc kubenswrapper[4749]: I0310 17:38:59.564844 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_7ffde1ba-19eb-4d74-84b7-f64e82c3770f/setup-container/0.log" Mar 10 17:38:59 crc kubenswrapper[4749]: I0310 17:38:59.725478 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7ffde1ba-19eb-4d74-84b7-f64e82c3770f/setup-container/0.log" Mar 10 17:38:59 crc kubenswrapper[4749]: I0310 17:38:59.781996 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7ffde1ba-19eb-4d74-84b7-f64e82c3770f/rabbitmq/0.log" Mar 10 17:39:16 crc kubenswrapper[4749]: I0310 17:39:16.820661 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r_a0aa2d7c-e6aa-427d-99ff-6fb5b258659f/util/0.log" Mar 10 17:39:17 crc kubenswrapper[4749]: I0310 17:39:17.019786 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r_a0aa2d7c-e6aa-427d-99ff-6fb5b258659f/pull/0.log" Mar 10 17:39:17 crc kubenswrapper[4749]: I0310 17:39:17.051354 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r_a0aa2d7c-e6aa-427d-99ff-6fb5b258659f/util/0.log" Mar 10 17:39:17 crc kubenswrapper[4749]: I0310 17:39:17.080678 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r_a0aa2d7c-e6aa-427d-99ff-6fb5b258659f/pull/0.log" Mar 10 17:39:17 crc kubenswrapper[4749]: I0310 17:39:17.251073 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r_a0aa2d7c-e6aa-427d-99ff-6fb5b258659f/util/0.log" Mar 10 17:39:17 crc kubenswrapper[4749]: I0310 17:39:17.299322 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r_a0aa2d7c-e6aa-427d-99ff-6fb5b258659f/pull/0.log" Mar 10 17:39:17 crc kubenswrapper[4749]: I0310 17:39:17.304301 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49bagvr5r_a0aa2d7c-e6aa-427d-99ff-6fb5b258659f/extract/0.log" Mar 10 17:39:17 crc kubenswrapper[4749]: I0310 17:39:17.991494 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-cmqsz_c0b12ff9-ef73-4f00-b0ed-655a5113714e/manager/0.log" Mar 10 17:39:18 crc kubenswrapper[4749]: I0310 17:39:18.297546 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-rk5qv_9aa25d5b-083e-4b81-ab1e-018e4305b8be/manager/0.log" Mar 10 17:39:18 crc kubenswrapper[4749]: I0310 17:39:18.432816 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-8wbhh_944a1147-1517-4491-b7ee-1d0479e25c4c/manager/0.log" Mar 10 17:39:18 crc kubenswrapper[4749]: I0310 17:39:18.680934 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-fcz88_4ad2d548-d5ed-4933-9a6d-1cb903434d41/manager/0.log" Mar 10 17:39:19 crc kubenswrapper[4749]: I0310 17:39:19.130032 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-nt44l_33bd7186-cfb3-49b4-aaf1-a8015fe78fbd/manager/0.log" Mar 10 17:39:19 crc kubenswrapper[4749]: I0310 17:39:19.402667 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-5r86d_b4cb9d6b-00f0-478e-a275-2720e6f90e8a/manager/0.log" Mar 10 17:39:19 crc kubenswrapper[4749]: I0310 17:39:19.737330 4749 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-94s5n_b0053e43-866e-4c68-b4fe-edc5b10110f2/manager/0.log" Mar 10 17:39:19 crc kubenswrapper[4749]: I0310 17:39:19.911988 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-5spgx_ce6cef40-3b60-442d-86b0-ad5b583183a4/manager/0.log" Mar 10 17:39:20 crc kubenswrapper[4749]: I0310 17:39:20.168326 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-rdtpp_6984ff81-5091-4cb9-b665-9dcd5544e193/manager/0.log" Mar 10 17:39:20 crc kubenswrapper[4749]: I0310 17:39:20.203562 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-w9j99_c7551811-07e9-4b2d-8367-8468bf446068/manager/0.log" Mar 10 17:39:20 crc kubenswrapper[4749]: I0310 17:39:20.466429 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-986pw_6b9320db-2215-4964-bbd5-7437a092fe31/manager/0.log" Mar 10 17:39:20 crc kubenswrapper[4749]: I0310 17:39:20.565334 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-kgb5j_86e55c7e-3719-4d43-9803-ec8185965320/manager/0.log" Mar 10 17:39:20 crc kubenswrapper[4749]: I0310 17:39:20.666549 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-gcgcm_3747cd39-1cb6-439f-8548-41e8f2a609f4/manager/0.log" Mar 10 17:39:20 crc kubenswrapper[4749]: I0310 17:39:20.842019 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6647d7885ft9ddh_5a87391b-1b62-4214-ae0d-07c29e9e5efa/manager/0.log" Mar 10 17:39:21 crc kubenswrapper[4749]: I0310 
17:39:21.196594 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6cf8df7788-6rcf4_97f36cd6-28eb-4a46-928c-0a1ea78da590/operator/0.log" Mar 10 17:39:21 crc kubenswrapper[4749]: I0310 17:39:21.425539 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qj8sc_a1314c71-434d-4aeb-8268-97011361d024/registry-server/0.log" Mar 10 17:39:21 crc kubenswrapper[4749]: I0310 17:39:21.533097 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-7kfw8_40b9eefc-cf39-40d7-8f08-415714ea31d9/manager/0.log" Mar 10 17:39:21 crc kubenswrapper[4749]: I0310 17:39:21.670694 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-mwm5j_369202bf-81ec-4cf9-8540-c6a05a2447aa/manager/0.log" Mar 10 17:39:22 crc kubenswrapper[4749]: I0310 17:39:22.033019 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-l8vqg_f35968c8-813f-473a-9bfc-46a3ff38318e/operator/0.log" Mar 10 17:39:22 crc kubenswrapper[4749]: I0310 17:39:22.211561 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-v6kk4_fe1af8b4-2a44-478b-9936-4e3fe4d90612/manager/0.log" Mar 10 17:39:22 crc kubenswrapper[4749]: I0310 17:39:22.340343 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-tzm5k_57b4a19f-1a4b-4db9-8e25-fb3ed92e1388/manager/0.log" Mar 10 17:39:22 crc kubenswrapper[4749]: I0310 17:39:22.362365 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6679ddfdc7-6rsrj_281e49ea-bf93-4ad0-8081-eced425b1a7e/manager/0.log" Mar 10 17:39:22 crc kubenswrapper[4749]: I0310 
17:39:22.509834 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-xrkmh_680e1c04-e829-45a0-a323-4d40ec62b076/manager/0.log" Mar 10 17:39:22 crc kubenswrapper[4749]: I0310 17:39:22.620932 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-fqvkq_2530b5b5-5bd6-430a-8646-f77cd6f4ceae/manager/0.log" Mar 10 17:39:29 crc kubenswrapper[4749]: I0310 17:39:29.099082 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-cl6tb_67fcadbc-6b7f-47b2-a723-544783895834/manager/0.log" Mar 10 17:39:44 crc kubenswrapper[4749]: I0310 17:39:44.269187 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-sqn6s_a31d7167-46e8-4c6f-b511-a4a86aa908f2/control-plane-machine-set-operator/0.log" Mar 10 17:39:44 crc kubenswrapper[4749]: I0310 17:39:44.648704 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8ntsm_1665ca47-2d24-469b-b53f-4d6b1b5b24c4/machine-api-operator/0.log" Mar 10 17:39:44 crc kubenswrapper[4749]: I0310 17:39:44.667801 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8ntsm_1665ca47-2d24-469b-b53f-4d6b1b5b24c4/kube-rbac-proxy/0.log" Mar 10 17:39:57 crc kubenswrapper[4749]: I0310 17:39:57.934073 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-9ztgr_72e98565-a830-4f8e-a99b-8430585e2763/cert-manager-controller/0.log" Mar 10 17:39:58 crc kubenswrapper[4749]: I0310 17:39:58.038675 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-n2h5b_787cdbbd-d4a2-4afb-8c0f-5c4dd773e52e/cert-manager-cainjector/0.log" Mar 10 17:39:58 crc 
kubenswrapper[4749]: I0310 17:39:58.167096 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-5d2nd_882b2688-d221-4a4c-8771-0df154029fcb/cert-manager-webhook/0.log" Mar 10 17:40:00 crc kubenswrapper[4749]: I0310 17:40:00.141267 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552740-qv8nb"] Mar 10 17:40:00 crc kubenswrapper[4749]: E0310 17:40:00.142148 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d8305c-b7c0-40d5-bbd9-b0185317d720" containerName="container-00" Mar 10 17:40:00 crc kubenswrapper[4749]: I0310 17:40:00.142163 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d8305c-b7c0-40d5-bbd9-b0185317d720" containerName="container-00" Mar 10 17:40:00 crc kubenswrapper[4749]: I0310 17:40:00.142306 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d8305c-b7c0-40d5-bbd9-b0185317d720" containerName="container-00" Mar 10 17:40:00 crc kubenswrapper[4749]: I0310 17:40:00.142875 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552740-qv8nb" Mar 10 17:40:00 crc kubenswrapper[4749]: I0310 17:40:00.146915 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:40:00 crc kubenswrapper[4749]: I0310 17:40:00.147790 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:40:00 crc kubenswrapper[4749]: I0310 17:40:00.148770 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:40:00 crc kubenswrapper[4749]: I0310 17:40:00.160232 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552740-qv8nb"] Mar 10 17:40:00 crc kubenswrapper[4749]: I0310 17:40:00.303737 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dztd\" (UniqueName: \"kubernetes.io/projected/d56422b8-6200-431d-a1a8-11653b1f4919-kube-api-access-4dztd\") pod \"auto-csr-approver-29552740-qv8nb\" (UID: \"d56422b8-6200-431d-a1a8-11653b1f4919\") " pod="openshift-infra/auto-csr-approver-29552740-qv8nb" Mar 10 17:40:00 crc kubenswrapper[4749]: I0310 17:40:00.405499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dztd\" (UniqueName: \"kubernetes.io/projected/d56422b8-6200-431d-a1a8-11653b1f4919-kube-api-access-4dztd\") pod \"auto-csr-approver-29552740-qv8nb\" (UID: \"d56422b8-6200-431d-a1a8-11653b1f4919\") " pod="openshift-infra/auto-csr-approver-29552740-qv8nb" Mar 10 17:40:00 crc kubenswrapper[4749]: I0310 17:40:00.435205 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dztd\" (UniqueName: \"kubernetes.io/projected/d56422b8-6200-431d-a1a8-11653b1f4919-kube-api-access-4dztd\") pod \"auto-csr-approver-29552740-qv8nb\" (UID: \"d56422b8-6200-431d-a1a8-11653b1f4919\") " 
pod="openshift-infra/auto-csr-approver-29552740-qv8nb" Mar 10 17:40:00 crc kubenswrapper[4749]: I0310 17:40:00.465910 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552740-qv8nb" Mar 10 17:40:00 crc kubenswrapper[4749]: I0310 17:40:00.945299 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552740-qv8nb"] Mar 10 17:40:00 crc kubenswrapper[4749]: I0310 17:40:00.953411 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 17:40:01 crc kubenswrapper[4749]: I0310 17:40:01.221162 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552740-qv8nb" event={"ID":"d56422b8-6200-431d-a1a8-11653b1f4919","Type":"ContainerStarted","Data":"50fc36611248fce4bcd2a9d787336f4bdace59db3918f454eff8ff0bad52e5b9"} Mar 10 17:40:03 crc kubenswrapper[4749]: I0310 17:40:03.240858 4749 generic.go:334] "Generic (PLEG): container finished" podID="d56422b8-6200-431d-a1a8-11653b1f4919" containerID="146507725c2548997dc23c39e3ff63aab76185ee884d3bca5a744c94f626e4b9" exitCode=0 Mar 10 17:40:03 crc kubenswrapper[4749]: I0310 17:40:03.240961 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552740-qv8nb" event={"ID":"d56422b8-6200-431d-a1a8-11653b1f4919","Type":"ContainerDied","Data":"146507725c2548997dc23c39e3ff63aab76185ee884d3bca5a744c94f626e4b9"} Mar 10 17:40:04 crc kubenswrapper[4749]: I0310 17:40:04.550235 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552740-qv8nb" Mar 10 17:40:04 crc kubenswrapper[4749]: I0310 17:40:04.684612 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dztd\" (UniqueName: \"kubernetes.io/projected/d56422b8-6200-431d-a1a8-11653b1f4919-kube-api-access-4dztd\") pod \"d56422b8-6200-431d-a1a8-11653b1f4919\" (UID: \"d56422b8-6200-431d-a1a8-11653b1f4919\") " Mar 10 17:40:04 crc kubenswrapper[4749]: I0310 17:40:04.690608 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56422b8-6200-431d-a1a8-11653b1f4919-kube-api-access-4dztd" (OuterVolumeSpecName: "kube-api-access-4dztd") pod "d56422b8-6200-431d-a1a8-11653b1f4919" (UID: "d56422b8-6200-431d-a1a8-11653b1f4919"). InnerVolumeSpecName "kube-api-access-4dztd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:40:04 crc kubenswrapper[4749]: I0310 17:40:04.786886 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dztd\" (UniqueName: \"kubernetes.io/projected/d56422b8-6200-431d-a1a8-11653b1f4919-kube-api-access-4dztd\") on node \"crc\" DevicePath \"\"" Mar 10 17:40:05 crc kubenswrapper[4749]: I0310 17:40:05.256805 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552740-qv8nb" event={"ID":"d56422b8-6200-431d-a1a8-11653b1f4919","Type":"ContainerDied","Data":"50fc36611248fce4bcd2a9d787336f4bdace59db3918f454eff8ff0bad52e5b9"} Mar 10 17:40:05 crc kubenswrapper[4749]: I0310 17:40:05.256848 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50fc36611248fce4bcd2a9d787336f4bdace59db3918f454eff8ff0bad52e5b9" Mar 10 17:40:05 crc kubenswrapper[4749]: I0310 17:40:05.256858 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552740-qv8nb" Mar 10 17:40:05 crc kubenswrapper[4749]: I0310 17:40:05.619684 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552734-bkqrn"] Mar 10 17:40:05 crc kubenswrapper[4749]: I0310 17:40:05.627199 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552734-bkqrn"] Mar 10 17:40:07 crc kubenswrapper[4749]: I0310 17:40:07.616172 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d38ec74-2164-42f9-bb6b-046acc6dec25" path="/var/lib/kubelet/pods/5d38ec74-2164-42f9-bb6b-046acc6dec25/volumes" Mar 10 17:40:11 crc kubenswrapper[4749]: I0310 17:40:11.197933 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-phg98_a2ab74d0-8aa2-4e6b-9ef9-d9213935bdd4/nmstate-console-plugin/0.log" Mar 10 17:40:11 crc kubenswrapper[4749]: I0310 17:40:11.354564 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-z4rkn_4e0d4ea3-d5f0-4da8-9dbb-9abbf15c54bb/nmstate-handler/0.log" Mar 10 17:40:11 crc kubenswrapper[4749]: I0310 17:40:11.402662 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-t8wjh_19dc9ec1-eeaa-4d4b-a800-cc90a945eef5/kube-rbac-proxy/0.log" Mar 10 17:40:11 crc kubenswrapper[4749]: I0310 17:40:11.514208 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-t8wjh_19dc9ec1-eeaa-4d4b-a800-cc90a945eef5/nmstate-metrics/0.log" Mar 10 17:40:11 crc kubenswrapper[4749]: I0310 17:40:11.575277 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-mht6p_fb72dd01-1d4a-4322-936b-60a188b23af8/nmstate-operator/0.log" Mar 10 17:40:11 crc kubenswrapper[4749]: I0310 17:40:11.714479 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-gn4cf_6951e69a-ab5b-48c2-9de9-b70d82ec527e/nmstate-webhook/0.log" Mar 10 17:40:29 crc kubenswrapper[4749]: I0310 17:40:29.165049 4749 scope.go:117] "RemoveContainer" containerID="98cab473538e43d5a81230e4b4ad80cdf3e428e8bc64853b68e5411981da21e3" Mar 10 17:40:38 crc kubenswrapper[4749]: I0310 17:40:38.551138 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-n8mpk_30390e60-a7e2-4abc-b7d6-2384bd758fdd/kube-rbac-proxy/0.log" Mar 10 17:40:38 crc kubenswrapper[4749]: I0310 17:40:38.774061 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-s98fv_ee72aaa3-2e8c-41d1-ae7e-c446e531300a/frr-k8s-webhook-server/0.log" Mar 10 17:40:38 crc kubenswrapper[4749]: I0310 17:40:38.838196 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xjbzw"] Mar 10 17:40:38 crc kubenswrapper[4749]: E0310 17:40:38.841306 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56422b8-6200-431d-a1a8-11653b1f4919" containerName="oc" Mar 10 17:40:38 crc kubenswrapper[4749]: I0310 17:40:38.841361 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56422b8-6200-431d-a1a8-11653b1f4919" containerName="oc" Mar 10 17:40:38 crc kubenswrapper[4749]: I0310 17:40:38.841587 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56422b8-6200-431d-a1a8-11653b1f4919" containerName="oc" Mar 10 17:40:38 crc kubenswrapper[4749]: I0310 17:40:38.844959 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:38 crc kubenswrapper[4749]: I0310 17:40:38.865548 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjbzw"] Mar 10 17:40:38 crc kubenswrapper[4749]: I0310 17:40:38.961065 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-n8mpk_30390e60-a7e2-4abc-b7d6-2384bd758fdd/controller/0.log" Mar 10 17:40:38 crc kubenswrapper[4749]: I0310 17:40:38.989950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa1f7228-4818-499d-bb75-25cca9eb6b52-catalog-content\") pod \"redhat-marketplace-xjbzw\" (UID: \"fa1f7228-4818-499d-bb75-25cca9eb6b52\") " pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:38 crc kubenswrapper[4749]: I0310 17:40:38.990058 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1f7228-4818-499d-bb75-25cca9eb6b52-utilities\") pod \"redhat-marketplace-xjbzw\" (UID: \"fa1f7228-4818-499d-bb75-25cca9eb6b52\") " pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:38 crc kubenswrapper[4749]: I0310 17:40:38.990148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc9wj\" (UniqueName: \"kubernetes.io/projected/fa1f7228-4818-499d-bb75-25cca9eb6b52-kube-api-access-dc9wj\") pod \"redhat-marketplace-xjbzw\" (UID: \"fa1f7228-4818-499d-bb75-25cca9eb6b52\") " pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.007038 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/cp-frr-files/0.log" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.091934 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1f7228-4818-499d-bb75-25cca9eb6b52-utilities\") pod \"redhat-marketplace-xjbzw\" (UID: \"fa1f7228-4818-499d-bb75-25cca9eb6b52\") " pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.092029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc9wj\" (UniqueName: \"kubernetes.io/projected/fa1f7228-4818-499d-bb75-25cca9eb6b52-kube-api-access-dc9wj\") pod \"redhat-marketplace-xjbzw\" (UID: \"fa1f7228-4818-499d-bb75-25cca9eb6b52\") " pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.092077 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa1f7228-4818-499d-bb75-25cca9eb6b52-catalog-content\") pod \"redhat-marketplace-xjbzw\" (UID: \"fa1f7228-4818-499d-bb75-25cca9eb6b52\") " pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.092520 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa1f7228-4818-499d-bb75-25cca9eb6b52-catalog-content\") pod \"redhat-marketplace-xjbzw\" (UID: \"fa1f7228-4818-499d-bb75-25cca9eb6b52\") " pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.092732 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1f7228-4818-499d-bb75-25cca9eb6b52-utilities\") pod \"redhat-marketplace-xjbzw\" (UID: \"fa1f7228-4818-499d-bb75-25cca9eb6b52\") " pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.109640 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dc9wj\" (UniqueName: \"kubernetes.io/projected/fa1f7228-4818-499d-bb75-25cca9eb6b52-kube-api-access-dc9wj\") pod \"redhat-marketplace-xjbzw\" (UID: \"fa1f7228-4818-499d-bb75-25cca9eb6b52\") " pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.183506 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.206056 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/cp-frr-files/0.log" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.276132 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/cp-reloader/0.log" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.278309 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/cp-metrics/0.log" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.341616 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/cp-reloader/0.log" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.506458 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/cp-frr-files/0.log" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.584093 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/cp-reloader/0.log" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.654776 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/cp-metrics/0.log" Mar 10 17:40:39 crc 
kubenswrapper[4749]: I0310 17:40:39.658573 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/cp-metrics/0.log" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.702620 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjbzw"] Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.876919 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/cp-frr-files/0.log" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.883166 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/cp-metrics/0.log" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.918977 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/cp-reloader/0.log" Mar 10 17:40:39 crc kubenswrapper[4749]: I0310 17:40:39.946840 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/controller/0.log" Mar 10 17:40:40 crc kubenswrapper[4749]: I0310 17:40:40.072798 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/frr-metrics/0.log" Mar 10 17:40:40 crc kubenswrapper[4749]: I0310 17:40:40.092835 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/kube-rbac-proxy/0.log" Mar 10 17:40:40 crc kubenswrapper[4749]: I0310 17:40:40.150237 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/kube-rbac-proxy-frr/0.log" Mar 10 17:40:40 crc kubenswrapper[4749]: I0310 17:40:40.315477 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/reloader/0.log" Mar 10 17:40:40 crc kubenswrapper[4749]: I0310 17:40:40.402600 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-59548cf5bb-pnl5v_2ab91d90-6c4b-46cb-9f09-eb6a2e1e6ad2/manager/0.log" Mar 10 17:40:40 crc kubenswrapper[4749]: I0310 17:40:40.546962 4749 generic.go:334] "Generic (PLEG): container finished" podID="fa1f7228-4818-499d-bb75-25cca9eb6b52" containerID="d6b928834a6670cad8e830132b2d8376d8093f9a5660fe9f7d0220d0954b4c9b" exitCode=0 Mar 10 17:40:40 crc kubenswrapper[4749]: I0310 17:40:40.548332 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjbzw" event={"ID":"fa1f7228-4818-499d-bb75-25cca9eb6b52","Type":"ContainerDied","Data":"d6b928834a6670cad8e830132b2d8376d8093f9a5660fe9f7d0220d0954b4c9b"} Mar 10 17:40:40 crc kubenswrapper[4749]: I0310 17:40:40.548359 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjbzw" event={"ID":"fa1f7228-4818-499d-bb75-25cca9eb6b52","Type":"ContainerStarted","Data":"77396cad84d21e5ce6495b79a5dacb3cf6559a3dcd2cdb7973d11cfafbe06b02"} Mar 10 17:40:40 crc kubenswrapper[4749]: I0310 17:40:40.595997 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-64c9df7c-wvtbx_3538a56a-7b3d-4fd3-9d65-f2b9ecc9de95/webhook-server/0.log" Mar 10 17:40:40 crc kubenswrapper[4749]: I0310 17:40:40.777932 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cmdn5_150f068d-c570-4301-8ea6-aceb34c0f84b/kube-rbac-proxy/0.log" Mar 10 17:40:41 crc kubenswrapper[4749]: I0310 17:40:41.446313 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cmdn5_150f068d-c570-4301-8ea6-aceb34c0f84b/speaker/0.log" Mar 10 17:40:41 crc kubenswrapper[4749]: I0310 17:40:41.555054 4749 
generic.go:334] "Generic (PLEG): container finished" podID="fa1f7228-4818-499d-bb75-25cca9eb6b52" containerID="838dab0c483cca65f3bdfe0f8577facfabca7bed1ffc0cc1624f5e289ce20786" exitCode=0 Mar 10 17:40:41 crc kubenswrapper[4749]: I0310 17:40:41.555296 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjbzw" event={"ID":"fa1f7228-4818-499d-bb75-25cca9eb6b52","Type":"ContainerDied","Data":"838dab0c483cca65f3bdfe0f8577facfabca7bed1ffc0cc1624f5e289ce20786"} Mar 10 17:40:42 crc kubenswrapper[4749]: I0310 17:40:42.270048 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wxp7g_45b99996-1ced-47bc-a309-4195e4880944/frr/0.log" Mar 10 17:40:42 crc kubenswrapper[4749]: I0310 17:40:42.574139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjbzw" event={"ID":"fa1f7228-4818-499d-bb75-25cca9eb6b52","Type":"ContainerStarted","Data":"f2330b35f56be52d70c3e3349eb2ed962b0fdde33d2d0fc9a9a4924b9fe5033a"} Mar 10 17:40:42 crc kubenswrapper[4749]: I0310 17:40:42.597891 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xjbzw" podStartSLOduration=2.985587786 podStartE2EDuration="4.59787463s" podCreationTimestamp="2026-03-10 17:40:38 +0000 UTC" firstStartedPulling="2026-03-10 17:40:40.549018887 +0000 UTC m=+6737.670884574" lastFinishedPulling="2026-03-10 17:40:42.161305741 +0000 UTC m=+6739.283171418" observedRunningTime="2026-03-10 17:40:42.592072423 +0000 UTC m=+6739.713938100" watchObservedRunningTime="2026-03-10 17:40:42.59787463 +0000 UTC m=+6739.719740317" Mar 10 17:40:49 crc kubenswrapper[4749]: I0310 17:40:49.185257 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:49 crc kubenswrapper[4749]: I0310 17:40:49.185876 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:49 crc kubenswrapper[4749]: I0310 17:40:49.231590 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:49 crc kubenswrapper[4749]: I0310 17:40:49.685849 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:49 crc kubenswrapper[4749]: I0310 17:40:49.742314 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjbzw"] Mar 10 17:40:50 crc kubenswrapper[4749]: I0310 17:40:50.980779 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:40:50 crc kubenswrapper[4749]: I0310 17:40:50.981140 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:40:51 crc kubenswrapper[4749]: I0310 17:40:51.643392 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xjbzw" podUID="fa1f7228-4818-499d-bb75-25cca9eb6b52" containerName="registry-server" containerID="cri-o://f2330b35f56be52d70c3e3349eb2ed962b0fdde33d2d0fc9a9a4924b9fe5033a" gracePeriod=2 Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.619661 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.653925 4749 generic.go:334] "Generic (PLEG): container finished" podID="fa1f7228-4818-499d-bb75-25cca9eb6b52" containerID="f2330b35f56be52d70c3e3349eb2ed962b0fdde33d2d0fc9a9a4924b9fe5033a" exitCode=0 Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.654005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjbzw" event={"ID":"fa1f7228-4818-499d-bb75-25cca9eb6b52","Type":"ContainerDied","Data":"f2330b35f56be52d70c3e3349eb2ed962b0fdde33d2d0fc9a9a4924b9fe5033a"} Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.654075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjbzw" event={"ID":"fa1f7228-4818-499d-bb75-25cca9eb6b52","Type":"ContainerDied","Data":"77396cad84d21e5ce6495b79a5dacb3cf6559a3dcd2cdb7973d11cfafbe06b02"} Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.654107 4749 scope.go:117] "RemoveContainer" containerID="f2330b35f56be52d70c3e3349eb2ed962b0fdde33d2d0fc9a9a4924b9fe5033a" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.654479 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjbzw" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.674460 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc9wj\" (UniqueName: \"kubernetes.io/projected/fa1f7228-4818-499d-bb75-25cca9eb6b52-kube-api-access-dc9wj\") pod \"fa1f7228-4818-499d-bb75-25cca9eb6b52\" (UID: \"fa1f7228-4818-499d-bb75-25cca9eb6b52\") " Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.674585 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa1f7228-4818-499d-bb75-25cca9eb6b52-catalog-content\") pod \"fa1f7228-4818-499d-bb75-25cca9eb6b52\" (UID: \"fa1f7228-4818-499d-bb75-25cca9eb6b52\") " Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.674662 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1f7228-4818-499d-bb75-25cca9eb6b52-utilities\") pod \"fa1f7228-4818-499d-bb75-25cca9eb6b52\" (UID: \"fa1f7228-4818-499d-bb75-25cca9eb6b52\") " Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.676019 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa1f7228-4818-499d-bb75-25cca9eb6b52-utilities" (OuterVolumeSpecName: "utilities") pod "fa1f7228-4818-499d-bb75-25cca9eb6b52" (UID: "fa1f7228-4818-499d-bb75-25cca9eb6b52"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.679284 4749 scope.go:117] "RemoveContainer" containerID="838dab0c483cca65f3bdfe0f8577facfabca7bed1ffc0cc1624f5e289ce20786" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.680691 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa1f7228-4818-499d-bb75-25cca9eb6b52-kube-api-access-dc9wj" (OuterVolumeSpecName: "kube-api-access-dc9wj") pod "fa1f7228-4818-499d-bb75-25cca9eb6b52" (UID: "fa1f7228-4818-499d-bb75-25cca9eb6b52"). InnerVolumeSpecName "kube-api-access-dc9wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.698845 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa1f7228-4818-499d-bb75-25cca9eb6b52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa1f7228-4818-499d-bb75-25cca9eb6b52" (UID: "fa1f7228-4818-499d-bb75-25cca9eb6b52"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.724760 4749 scope.go:117] "RemoveContainer" containerID="d6b928834a6670cad8e830132b2d8376d8093f9a5660fe9f7d0220d0954b4c9b" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.764258 4749 scope.go:117] "RemoveContainer" containerID="f2330b35f56be52d70c3e3349eb2ed962b0fdde33d2d0fc9a9a4924b9fe5033a" Mar 10 17:40:52 crc kubenswrapper[4749]: E0310 17:40:52.765321 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2330b35f56be52d70c3e3349eb2ed962b0fdde33d2d0fc9a9a4924b9fe5033a\": container with ID starting with f2330b35f56be52d70c3e3349eb2ed962b0fdde33d2d0fc9a9a4924b9fe5033a not found: ID does not exist" containerID="f2330b35f56be52d70c3e3349eb2ed962b0fdde33d2d0fc9a9a4924b9fe5033a" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.765389 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2330b35f56be52d70c3e3349eb2ed962b0fdde33d2d0fc9a9a4924b9fe5033a"} err="failed to get container status \"f2330b35f56be52d70c3e3349eb2ed962b0fdde33d2d0fc9a9a4924b9fe5033a\": rpc error: code = NotFound desc = could not find container \"f2330b35f56be52d70c3e3349eb2ed962b0fdde33d2d0fc9a9a4924b9fe5033a\": container with ID starting with f2330b35f56be52d70c3e3349eb2ed962b0fdde33d2d0fc9a9a4924b9fe5033a not found: ID does not exist" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.765421 4749 scope.go:117] "RemoveContainer" containerID="838dab0c483cca65f3bdfe0f8577facfabca7bed1ffc0cc1624f5e289ce20786" Mar 10 17:40:52 crc kubenswrapper[4749]: E0310 17:40:52.765742 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"838dab0c483cca65f3bdfe0f8577facfabca7bed1ffc0cc1624f5e289ce20786\": container with ID starting with 
838dab0c483cca65f3bdfe0f8577facfabca7bed1ffc0cc1624f5e289ce20786 not found: ID does not exist" containerID="838dab0c483cca65f3bdfe0f8577facfabca7bed1ffc0cc1624f5e289ce20786" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.765768 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"838dab0c483cca65f3bdfe0f8577facfabca7bed1ffc0cc1624f5e289ce20786"} err="failed to get container status \"838dab0c483cca65f3bdfe0f8577facfabca7bed1ffc0cc1624f5e289ce20786\": rpc error: code = NotFound desc = could not find container \"838dab0c483cca65f3bdfe0f8577facfabca7bed1ffc0cc1624f5e289ce20786\": container with ID starting with 838dab0c483cca65f3bdfe0f8577facfabca7bed1ffc0cc1624f5e289ce20786 not found: ID does not exist" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.765784 4749 scope.go:117] "RemoveContainer" containerID="d6b928834a6670cad8e830132b2d8376d8093f9a5660fe9f7d0220d0954b4c9b" Mar 10 17:40:52 crc kubenswrapper[4749]: E0310 17:40:52.766128 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b928834a6670cad8e830132b2d8376d8093f9a5660fe9f7d0220d0954b4c9b\": container with ID starting with d6b928834a6670cad8e830132b2d8376d8093f9a5660fe9f7d0220d0954b4c9b not found: ID does not exist" containerID="d6b928834a6670cad8e830132b2d8376d8093f9a5660fe9f7d0220d0954b4c9b" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.766157 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b928834a6670cad8e830132b2d8376d8093f9a5660fe9f7d0220d0954b4c9b"} err="failed to get container status \"d6b928834a6670cad8e830132b2d8376d8093f9a5660fe9f7d0220d0954b4c9b\": rpc error: code = NotFound desc = could not find container \"d6b928834a6670cad8e830132b2d8376d8093f9a5660fe9f7d0220d0954b4c9b\": container with ID starting with d6b928834a6670cad8e830132b2d8376d8093f9a5660fe9f7d0220d0954b4c9b not found: ID does not 
exist" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.776518 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa1f7228-4818-499d-bb75-25cca9eb6b52-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.776549 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc9wj\" (UniqueName: \"kubernetes.io/projected/fa1f7228-4818-499d-bb75-25cca9eb6b52-kube-api-access-dc9wj\") on node \"crc\" DevicePath \"\"" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.776565 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa1f7228-4818-499d-bb75-25cca9eb6b52-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 17:40:52 crc kubenswrapper[4749]: I0310 17:40:52.988846 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjbzw"] Mar 10 17:40:53 crc kubenswrapper[4749]: I0310 17:40:53.000478 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjbzw"] Mar 10 17:40:53 crc kubenswrapper[4749]: I0310 17:40:53.617728 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa1f7228-4818-499d-bb75-25cca9eb6b52" path="/var/lib/kubelet/pods/fa1f7228-4818-499d-bb75-25cca9eb6b52/volumes" Mar 10 17:40:53 crc kubenswrapper[4749]: I0310 17:40:53.993420 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_d2a63022-7abc-4ef6-81fa-da39b0121c51/util/0.log" Mar 10 17:40:54 crc kubenswrapper[4749]: I0310 17:40:54.235312 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_d2a63022-7abc-4ef6-81fa-da39b0121c51/pull/0.log" Mar 10 17:40:54 crc kubenswrapper[4749]: I0310 17:40:54.301988 
4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_d2a63022-7abc-4ef6-81fa-da39b0121c51/util/0.log" Mar 10 17:40:54 crc kubenswrapper[4749]: I0310 17:40:54.378964 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_d2a63022-7abc-4ef6-81fa-da39b0121c51/pull/0.log" Mar 10 17:40:54 crc kubenswrapper[4749]: I0310 17:40:54.531452 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_d2a63022-7abc-4ef6-81fa-da39b0121c51/pull/0.log" Mar 10 17:40:54 crc kubenswrapper[4749]: I0310 17:40:54.564006 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_d2a63022-7abc-4ef6-81fa-da39b0121c51/extract/0.log" Mar 10 17:40:54 crc kubenswrapper[4749]: I0310 17:40:54.565073 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82f75sh_d2a63022-7abc-4ef6-81fa-da39b0121c51/util/0.log" Mar 10 17:40:54 crc kubenswrapper[4749]: I0310 17:40:54.729916 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd_8225dfe7-8f9f-4460-a2e1-800f515e9021/util/0.log" Mar 10 17:40:54 crc kubenswrapper[4749]: I0310 17:40:54.902162 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd_8225dfe7-8f9f-4460-a2e1-800f515e9021/util/0.log" Mar 10 17:40:54 crc kubenswrapper[4749]: I0310 17:40:54.929848 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd_8225dfe7-8f9f-4460-a2e1-800f515e9021/pull/0.log" Mar 10 17:40:54 crc kubenswrapper[4749]: I0310 17:40:54.948386 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd_8225dfe7-8f9f-4460-a2e1-800f515e9021/pull/0.log" Mar 10 17:40:55 crc kubenswrapper[4749]: I0310 17:40:55.127313 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd_8225dfe7-8f9f-4460-a2e1-800f515e9021/util/0.log" Mar 10 17:40:55 crc kubenswrapper[4749]: I0310 17:40:55.159715 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd_8225dfe7-8f9f-4460-a2e1-800f515e9021/pull/0.log" Mar 10 17:40:55 crc kubenswrapper[4749]: I0310 17:40:55.181041 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e58wwhd_8225dfe7-8f9f-4460-a2e1-800f515e9021/extract/0.log" Mar 10 17:40:55 crc kubenswrapper[4749]: I0310 17:40:55.312552 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wxr96_61ae455d-1747-4883-b19d-3cbe4aa77dcd/extract-utilities/0.log" Mar 10 17:40:55 crc kubenswrapper[4749]: I0310 17:40:55.505940 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wxr96_61ae455d-1747-4883-b19d-3cbe4aa77dcd/extract-content/0.log" Mar 10 17:40:55 crc kubenswrapper[4749]: I0310 17:40:55.536441 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wxr96_61ae455d-1747-4883-b19d-3cbe4aa77dcd/extract-content/0.log" Mar 10 17:40:55 crc kubenswrapper[4749]: I0310 17:40:55.542845 4749 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wxr96_61ae455d-1747-4883-b19d-3cbe4aa77dcd/extract-utilities/0.log" Mar 10 17:40:55 crc kubenswrapper[4749]: I0310 17:40:55.752341 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wxr96_61ae455d-1747-4883-b19d-3cbe4aa77dcd/extract-utilities/0.log" Mar 10 17:40:55 crc kubenswrapper[4749]: I0310 17:40:55.759648 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wxr96_61ae455d-1747-4883-b19d-3cbe4aa77dcd/extract-content/0.log" Mar 10 17:40:55 crc kubenswrapper[4749]: I0310 17:40:55.899797 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7h9wh_08fd18d0-2c32-414f-a725-a54c904db468/extract-utilities/0.log" Mar 10 17:40:56 crc kubenswrapper[4749]: I0310 17:40:56.130096 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7h9wh_08fd18d0-2c32-414f-a725-a54c904db468/extract-utilities/0.log" Mar 10 17:40:56 crc kubenswrapper[4749]: I0310 17:40:56.177325 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7h9wh_08fd18d0-2c32-414f-a725-a54c904db468/extract-content/0.log" Mar 10 17:40:56 crc kubenswrapper[4749]: I0310 17:40:56.187517 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7h9wh_08fd18d0-2c32-414f-a725-a54c904db468/extract-content/0.log" Mar 10 17:40:56 crc kubenswrapper[4749]: I0310 17:40:56.389822 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7h9wh_08fd18d0-2c32-414f-a725-a54c904db468/extract-utilities/0.log" Mar 10 17:40:56 crc kubenswrapper[4749]: I0310 17:40:56.441519 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-7h9wh_08fd18d0-2c32-414f-a725-a54c904db468/extract-content/0.log" Mar 10 17:40:56 crc kubenswrapper[4749]: I0310 17:40:56.662978 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb_b7a05570-e2b0-4d18-bad1-485091a3fdc5/util/0.log" Mar 10 17:40:56 crc kubenswrapper[4749]: I0310 17:40:56.871064 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wxr96_61ae455d-1747-4883-b19d-3cbe4aa77dcd/registry-server/0.log" Mar 10 17:40:56 crc kubenswrapper[4749]: I0310 17:40:56.882127 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb_b7a05570-e2b0-4d18-bad1-485091a3fdc5/util/0.log" Mar 10 17:40:56 crc kubenswrapper[4749]: I0310 17:40:56.955931 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb_b7a05570-e2b0-4d18-bad1-485091a3fdc5/pull/0.log" Mar 10 17:40:57 crc kubenswrapper[4749]: I0310 17:40:57.105938 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb_b7a05570-e2b0-4d18-bad1-485091a3fdc5/pull/0.log" Mar 10 17:40:57 crc kubenswrapper[4749]: I0310 17:40:57.344240 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb_b7a05570-e2b0-4d18-bad1-485091a3fdc5/util/0.log" Mar 10 17:40:57 crc kubenswrapper[4749]: I0310 17:40:57.361533 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb_b7a05570-e2b0-4d18-bad1-485091a3fdc5/extract/0.log" Mar 10 17:40:57 crc kubenswrapper[4749]: I0310 
17:40:57.389561 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4kw8tb_b7a05570-e2b0-4d18-bad1-485091a3fdc5/pull/0.log" Mar 10 17:40:57 crc kubenswrapper[4749]: I0310 17:40:57.608396 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2mnc7_6904f9b8-adbe-426e-9021-0da77d658ad6/marketplace-operator/0.log" Mar 10 17:40:57 crc kubenswrapper[4749]: I0310 17:40:57.778410 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7h9wh_08fd18d0-2c32-414f-a725-a54c904db468/registry-server/0.log" Mar 10 17:40:57 crc kubenswrapper[4749]: I0310 17:40:57.853178 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pcmnl_fb151eb3-5433-4e8c-a9ac-556a3172438a/extract-utilities/0.log" Mar 10 17:40:58 crc kubenswrapper[4749]: I0310 17:40:58.020560 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pcmnl_fb151eb3-5433-4e8c-a9ac-556a3172438a/extract-content/0.log" Mar 10 17:40:58 crc kubenswrapper[4749]: I0310 17:40:58.024475 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pcmnl_fb151eb3-5433-4e8c-a9ac-556a3172438a/extract-utilities/0.log" Mar 10 17:40:58 crc kubenswrapper[4749]: I0310 17:40:58.052725 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pcmnl_fb151eb3-5433-4e8c-a9ac-556a3172438a/extract-content/0.log" Mar 10 17:40:58 crc kubenswrapper[4749]: I0310 17:40:58.237077 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pcmnl_fb151eb3-5433-4e8c-a9ac-556a3172438a/extract-utilities/0.log" Mar 10 17:40:58 crc kubenswrapper[4749]: I0310 17:40:58.260225 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-pcmnl_fb151eb3-5433-4e8c-a9ac-556a3172438a/extract-content/0.log" Mar 10 17:40:58 crc kubenswrapper[4749]: I0310 17:40:58.488950 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vb8jm_744279f2-54a7-4fac-97eb-857784a119fb/extract-utilities/0.log" Mar 10 17:40:58 crc kubenswrapper[4749]: I0310 17:40:58.514327 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pcmnl_fb151eb3-5433-4e8c-a9ac-556a3172438a/registry-server/0.log" Mar 10 17:40:58 crc kubenswrapper[4749]: I0310 17:40:58.622930 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vb8jm_744279f2-54a7-4fac-97eb-857784a119fb/extract-content/0.log" Mar 10 17:40:58 crc kubenswrapper[4749]: I0310 17:40:58.627343 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vb8jm_744279f2-54a7-4fac-97eb-857784a119fb/extract-utilities/0.log" Mar 10 17:40:58 crc kubenswrapper[4749]: I0310 17:40:58.670850 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vb8jm_744279f2-54a7-4fac-97eb-857784a119fb/extract-content/0.log" Mar 10 17:40:58 crc kubenswrapper[4749]: I0310 17:40:58.811369 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vb8jm_744279f2-54a7-4fac-97eb-857784a119fb/extract-content/0.log" Mar 10 17:40:58 crc kubenswrapper[4749]: I0310 17:40:58.826301 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vb8jm_744279f2-54a7-4fac-97eb-857784a119fb/extract-utilities/0.log" Mar 10 17:40:59 crc kubenswrapper[4749]: I0310 17:40:59.162629 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vb8jm_744279f2-54a7-4fac-97eb-857784a119fb/registry-server/0.log" Mar 10 
17:41:20 crc kubenswrapper[4749]: I0310 17:41:20.980793 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:41:20 crc kubenswrapper[4749]: I0310 17:41:20.981549 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:41:50 crc kubenswrapper[4749]: I0310 17:41:50.980043 4749 patch_prober.go:28] interesting pod/machine-config-daemon-p7rts container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 17:41:50 crc kubenswrapper[4749]: I0310 17:41:50.980536 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 17:41:50 crc kubenswrapper[4749]: I0310 17:41:50.980583 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" Mar 10 17:41:50 crc kubenswrapper[4749]: I0310 17:41:50.981868 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a"} 
pod="openshift-machine-config-operator/machine-config-daemon-p7rts" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 17:41:50 crc kubenswrapper[4749]: I0310 17:41:50.981936 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerName="machine-config-daemon" containerID="cri-o://5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" gracePeriod=600 Mar 10 17:41:51 crc kubenswrapper[4749]: E0310 17:41:51.076141 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebcbc0fc_15f3_4e4e_ae14_832adec8da50.slice/crio-conmon-5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a.scope\": RecentStats: unable to find data in memory cache]" Mar 10 17:41:51 crc kubenswrapper[4749]: E0310 17:41:51.118663 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:41:51 crc kubenswrapper[4749]: I0310 17:41:51.198165 4749 generic.go:334] "Generic (PLEG): container finished" podID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" exitCode=0 Mar 10 17:41:51 crc kubenswrapper[4749]: I0310 17:41:51.198217 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" 
event={"ID":"ebcbc0fc-15f3-4e4e-ae14-832adec8da50","Type":"ContainerDied","Data":"5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a"} Mar 10 17:41:51 crc kubenswrapper[4749]: I0310 17:41:51.198253 4749 scope.go:117] "RemoveContainer" containerID="1aac784073c11425463c0895d5eaa15a3a702ef7c8a1bfa3648c827673147728" Mar 10 17:41:51 crc kubenswrapper[4749]: I0310 17:41:51.198950 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:41:51 crc kubenswrapper[4749]: E0310 17:41:51.199364 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.176676 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552742-2f58q"] Mar 10 17:42:00 crc kubenswrapper[4749]: E0310 17:42:00.177668 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1f7228-4818-499d-bb75-25cca9eb6b52" containerName="extract-content" Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.177686 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1f7228-4818-499d-bb75-25cca9eb6b52" containerName="extract-content" Mar 10 17:42:00 crc kubenswrapper[4749]: E0310 17:42:00.177708 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1f7228-4818-499d-bb75-25cca9eb6b52" containerName="extract-utilities" Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.177715 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1f7228-4818-499d-bb75-25cca9eb6b52" containerName="extract-utilities" Mar 10 17:42:00 crc 
kubenswrapper[4749]: E0310 17:42:00.177738 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1f7228-4818-499d-bb75-25cca9eb6b52" containerName="registry-server" Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.177747 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1f7228-4818-499d-bb75-25cca9eb6b52" containerName="registry-server" Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.177923 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1f7228-4818-499d-bb75-25cca9eb6b52" containerName="registry-server" Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.178556 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552742-2f58q" Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.184926 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.185559 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.185721 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.193271 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552742-2f58q"] Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.299914 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tdsj\" (UniqueName: \"kubernetes.io/projected/ad678657-22d9-4a76-8706-282904f670e8-kube-api-access-4tdsj\") pod \"auto-csr-approver-29552742-2f58q\" (UID: \"ad678657-22d9-4a76-8706-282904f670e8\") " pod="openshift-infra/auto-csr-approver-29552742-2f58q" Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.402781 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tdsj\" (UniqueName: \"kubernetes.io/projected/ad678657-22d9-4a76-8706-282904f670e8-kube-api-access-4tdsj\") pod \"auto-csr-approver-29552742-2f58q\" (UID: \"ad678657-22d9-4a76-8706-282904f670e8\") " pod="openshift-infra/auto-csr-approver-29552742-2f58q" Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.435669 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tdsj\" (UniqueName: \"kubernetes.io/projected/ad678657-22d9-4a76-8706-282904f670e8-kube-api-access-4tdsj\") pod \"auto-csr-approver-29552742-2f58q\" (UID: \"ad678657-22d9-4a76-8706-282904f670e8\") " pod="openshift-infra/auto-csr-approver-29552742-2f58q" Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.511644 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552742-2f58q" Mar 10 17:42:00 crc kubenswrapper[4749]: I0310 17:42:00.923587 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552742-2f58q"] Mar 10 17:42:01 crc kubenswrapper[4749]: I0310 17:42:01.286479 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552742-2f58q" event={"ID":"ad678657-22d9-4a76-8706-282904f670e8","Type":"ContainerStarted","Data":"2c9d3e9ee42164f5a98ec2c6104857accdd6bf015c7e4ddd0e3ad92052faf8dc"} Mar 10 17:42:03 crc kubenswrapper[4749]: I0310 17:42:03.305419 4749 generic.go:334] "Generic (PLEG): container finished" podID="ad678657-22d9-4a76-8706-282904f670e8" containerID="e99b3b1207d28754b3d6019892845bc4772080906be768c5c3295c509715d047" exitCode=0 Mar 10 17:42:03 crc kubenswrapper[4749]: I0310 17:42:03.305584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552742-2f58q" 
event={"ID":"ad678657-22d9-4a76-8706-282904f670e8","Type":"ContainerDied","Data":"e99b3b1207d28754b3d6019892845bc4772080906be768c5c3295c509715d047"} Mar 10 17:42:04 crc kubenswrapper[4749]: I0310 17:42:04.606773 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:42:04 crc kubenswrapper[4749]: E0310 17:42:04.607363 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:42:04 crc kubenswrapper[4749]: I0310 17:42:04.642314 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552742-2f58q" Mar 10 17:42:04 crc kubenswrapper[4749]: I0310 17:42:04.690542 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tdsj\" (UniqueName: \"kubernetes.io/projected/ad678657-22d9-4a76-8706-282904f670e8-kube-api-access-4tdsj\") pod \"ad678657-22d9-4a76-8706-282904f670e8\" (UID: \"ad678657-22d9-4a76-8706-282904f670e8\") " Mar 10 17:42:04 crc kubenswrapper[4749]: I0310 17:42:04.695457 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad678657-22d9-4a76-8706-282904f670e8-kube-api-access-4tdsj" (OuterVolumeSpecName: "kube-api-access-4tdsj") pod "ad678657-22d9-4a76-8706-282904f670e8" (UID: "ad678657-22d9-4a76-8706-282904f670e8"). InnerVolumeSpecName "kube-api-access-4tdsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:42:04 crc kubenswrapper[4749]: I0310 17:42:04.792657 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tdsj\" (UniqueName: \"kubernetes.io/projected/ad678657-22d9-4a76-8706-282904f670e8-kube-api-access-4tdsj\") on node \"crc\" DevicePath \"\"" Mar 10 17:42:05 crc kubenswrapper[4749]: I0310 17:42:05.329769 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552742-2f58q" event={"ID":"ad678657-22d9-4a76-8706-282904f670e8","Type":"ContainerDied","Data":"2c9d3e9ee42164f5a98ec2c6104857accdd6bf015c7e4ddd0e3ad92052faf8dc"} Mar 10 17:42:05 crc kubenswrapper[4749]: I0310 17:42:05.329824 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c9d3e9ee42164f5a98ec2c6104857accdd6bf015c7e4ddd0e3ad92052faf8dc" Mar 10 17:42:05 crc kubenswrapper[4749]: I0310 17:42:05.330354 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552742-2f58q" Mar 10 17:42:05 crc kubenswrapper[4749]: I0310 17:42:05.722082 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552736-5rfjr"] Mar 10 17:42:05 crc kubenswrapper[4749]: I0310 17:42:05.734062 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552736-5rfjr"] Mar 10 17:42:07 crc kubenswrapper[4749]: I0310 17:42:07.623254 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f21f95d9-d2ee-446b-909b-ec2496dd4061" path="/var/lib/kubelet/pods/f21f95d9-d2ee-446b-909b-ec2496dd4061/volumes" Mar 10 17:42:18 crc kubenswrapper[4749]: I0310 17:42:18.607064 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:42:18 crc kubenswrapper[4749]: E0310 17:42:18.607837 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:42:28 crc kubenswrapper[4749]: I0310 17:42:28.570136 4749 generic.go:334] "Generic (PLEG): container finished" podID="1f80f845-a6b7-4b34-885c-dc2c3773a9d5" containerID="616fb403d48848f77d4b05d0184f475cbb65bee095ce7ae18a3204406703b5f2" exitCode=0 Mar 10 17:42:28 crc kubenswrapper[4749]: I0310 17:42:28.570193 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xnqc/must-gather-2d6ml" event={"ID":"1f80f845-a6b7-4b34-885c-dc2c3773a9d5","Type":"ContainerDied","Data":"616fb403d48848f77d4b05d0184f475cbb65bee095ce7ae18a3204406703b5f2"} Mar 10 17:42:28 crc kubenswrapper[4749]: I0310 17:42:28.572337 4749 scope.go:117] "RemoveContainer" containerID="616fb403d48848f77d4b05d0184f475cbb65bee095ce7ae18a3204406703b5f2" Mar 10 17:42:29 crc kubenswrapper[4749]: I0310 17:42:29.303124 4749 scope.go:117] "RemoveContainer" containerID="5f2a3259ac821d9278d8aece0fefa9e5efda096d38a63b092a3efd229ea26840" Mar 10 17:42:29 crc kubenswrapper[4749]: I0310 17:42:29.383095 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8xnqc_must-gather-2d6ml_1f80f845-a6b7-4b34-885c-dc2c3773a9d5/gather/0.log" Mar 10 17:42:32 crc kubenswrapper[4749]: I0310 17:42:32.606298 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:42:32 crc kubenswrapper[4749]: E0310 17:42:32.608623 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:42:37 crc kubenswrapper[4749]: I0310 17:42:37.760365 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8xnqc/must-gather-2d6ml"] Mar 10 17:42:37 crc kubenswrapper[4749]: I0310 17:42:37.761010 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8xnqc/must-gather-2d6ml" podUID="1f80f845-a6b7-4b34-885c-dc2c3773a9d5" containerName="copy" containerID="cri-o://ae391d13ce1aac252844a3ecbc795cf5f6babd5082dead627e77031ff718916a" gracePeriod=2 Mar 10 17:42:37 crc kubenswrapper[4749]: I0310 17:42:37.773961 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8xnqc/must-gather-2d6ml"] Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.214997 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8xnqc_must-gather-2d6ml_1f80f845-a6b7-4b34-885c-dc2c3773a9d5/copy/0.log" Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.215816 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8xnqc/must-gather-2d6ml" Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.270296 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1f80f845-a6b7-4b34-885c-dc2c3773a9d5-must-gather-output\") pod \"1f80f845-a6b7-4b34-885c-dc2c3773a9d5\" (UID: \"1f80f845-a6b7-4b34-885c-dc2c3773a9d5\") " Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.270370 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt2mv\" (UniqueName: \"kubernetes.io/projected/1f80f845-a6b7-4b34-885c-dc2c3773a9d5-kube-api-access-dt2mv\") pod \"1f80f845-a6b7-4b34-885c-dc2c3773a9d5\" (UID: \"1f80f845-a6b7-4b34-885c-dc2c3773a9d5\") " Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.278182 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f80f845-a6b7-4b34-885c-dc2c3773a9d5-kube-api-access-dt2mv" (OuterVolumeSpecName: "kube-api-access-dt2mv") pod "1f80f845-a6b7-4b34-885c-dc2c3773a9d5" (UID: "1f80f845-a6b7-4b34-885c-dc2c3773a9d5"). InnerVolumeSpecName "kube-api-access-dt2mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.372624 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt2mv\" (UniqueName: \"kubernetes.io/projected/1f80f845-a6b7-4b34-885c-dc2c3773a9d5-kube-api-access-dt2mv\") on node \"crc\" DevicePath \"\"" Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.374593 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f80f845-a6b7-4b34-885c-dc2c3773a9d5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1f80f845-a6b7-4b34-885c-dc2c3773a9d5" (UID: "1f80f845-a6b7-4b34-885c-dc2c3773a9d5"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.475122 4749 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1f80f845-a6b7-4b34-885c-dc2c3773a9d5-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.660463 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8xnqc_must-gather-2d6ml_1f80f845-a6b7-4b34-885c-dc2c3773a9d5/copy/0.log" Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.661554 4749 generic.go:334] "Generic (PLEG): container finished" podID="1f80f845-a6b7-4b34-885c-dc2c3773a9d5" containerID="ae391d13ce1aac252844a3ecbc795cf5f6babd5082dead627e77031ff718916a" exitCode=143 Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.661581 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xnqc/must-gather-2d6ml" Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.661620 4749 scope.go:117] "RemoveContainer" containerID="ae391d13ce1aac252844a3ecbc795cf5f6babd5082dead627e77031ff718916a" Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.690525 4749 scope.go:117] "RemoveContainer" containerID="616fb403d48848f77d4b05d0184f475cbb65bee095ce7ae18a3204406703b5f2" Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.761046 4749 scope.go:117] "RemoveContainer" containerID="ae391d13ce1aac252844a3ecbc795cf5f6babd5082dead627e77031ff718916a" Mar 10 17:42:38 crc kubenswrapper[4749]: E0310 17:42:38.761644 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae391d13ce1aac252844a3ecbc795cf5f6babd5082dead627e77031ff718916a\": container with ID starting with ae391d13ce1aac252844a3ecbc795cf5f6babd5082dead627e77031ff718916a not found: ID does not exist" 
containerID="ae391d13ce1aac252844a3ecbc795cf5f6babd5082dead627e77031ff718916a" Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.761695 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae391d13ce1aac252844a3ecbc795cf5f6babd5082dead627e77031ff718916a"} err="failed to get container status \"ae391d13ce1aac252844a3ecbc795cf5f6babd5082dead627e77031ff718916a\": rpc error: code = NotFound desc = could not find container \"ae391d13ce1aac252844a3ecbc795cf5f6babd5082dead627e77031ff718916a\": container with ID starting with ae391d13ce1aac252844a3ecbc795cf5f6babd5082dead627e77031ff718916a not found: ID does not exist" Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.761720 4749 scope.go:117] "RemoveContainer" containerID="616fb403d48848f77d4b05d0184f475cbb65bee095ce7ae18a3204406703b5f2" Mar 10 17:42:38 crc kubenswrapper[4749]: E0310 17:42:38.762320 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"616fb403d48848f77d4b05d0184f475cbb65bee095ce7ae18a3204406703b5f2\": container with ID starting with 616fb403d48848f77d4b05d0184f475cbb65bee095ce7ae18a3204406703b5f2 not found: ID does not exist" containerID="616fb403d48848f77d4b05d0184f475cbb65bee095ce7ae18a3204406703b5f2" Mar 10 17:42:38 crc kubenswrapper[4749]: I0310 17:42:38.762412 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616fb403d48848f77d4b05d0184f475cbb65bee095ce7ae18a3204406703b5f2"} err="failed to get container status \"616fb403d48848f77d4b05d0184f475cbb65bee095ce7ae18a3204406703b5f2\": rpc error: code = NotFound desc = could not find container \"616fb403d48848f77d4b05d0184f475cbb65bee095ce7ae18a3204406703b5f2\": container with ID starting with 616fb403d48848f77d4b05d0184f475cbb65bee095ce7ae18a3204406703b5f2 not found: ID does not exist" Mar 10 17:42:39 crc kubenswrapper[4749]: I0310 17:42:39.615068 4749 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f80f845-a6b7-4b34-885c-dc2c3773a9d5" path="/var/lib/kubelet/pods/1f80f845-a6b7-4b34-885c-dc2c3773a9d5/volumes" Mar 10 17:42:46 crc kubenswrapper[4749]: I0310 17:42:46.606864 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:42:46 crc kubenswrapper[4749]: E0310 17:42:46.607697 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:42:57 crc kubenswrapper[4749]: I0310 17:42:57.606887 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:42:57 crc kubenswrapper[4749]: E0310 17:42:57.608113 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:43:08 crc kubenswrapper[4749]: I0310 17:43:08.607473 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:43:08 crc kubenswrapper[4749]: E0310 17:43:08.608698 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:43:19 crc kubenswrapper[4749]: I0310 17:43:19.607996 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:43:19 crc kubenswrapper[4749]: E0310 17:43:19.608812 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:43:31 crc kubenswrapper[4749]: I0310 17:43:31.607258 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:43:31 crc kubenswrapper[4749]: E0310 17:43:31.607974 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:43:42 crc kubenswrapper[4749]: I0310 17:43:42.606808 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:43:42 crc kubenswrapper[4749]: E0310 17:43:42.607565 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:43:54 crc kubenswrapper[4749]: I0310 17:43:54.606853 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:43:54 crc kubenswrapper[4749]: E0310 17:43:54.607859 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.137321 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552744-zqb6v"] Mar 10 17:44:00 crc kubenswrapper[4749]: E0310 17:44:00.138072 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f80f845-a6b7-4b34-885c-dc2c3773a9d5" containerName="copy" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.138084 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f80f845-a6b7-4b34-885c-dc2c3773a9d5" containerName="copy" Mar 10 17:44:00 crc kubenswrapper[4749]: E0310 17:44:00.138091 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad678657-22d9-4a76-8706-282904f670e8" containerName="oc" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.138097 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad678657-22d9-4a76-8706-282904f670e8" containerName="oc" Mar 10 17:44:00 crc kubenswrapper[4749]: E0310 17:44:00.138109 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1f80f845-a6b7-4b34-885c-dc2c3773a9d5" containerName="gather" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.138115 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f80f845-a6b7-4b34-885c-dc2c3773a9d5" containerName="gather" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.138248 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f80f845-a6b7-4b34-885c-dc2c3773a9d5" containerName="copy" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.138262 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad678657-22d9-4a76-8706-282904f670e8" containerName="oc" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.138272 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f80f845-a6b7-4b34-885c-dc2c3773a9d5" containerName="gather" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.138818 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552744-zqb6v" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.141782 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.142417 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.144567 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vkrc7" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.155944 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552744-zqb6v"] Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.332334 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rprrm\" (UniqueName: 
\"kubernetes.io/projected/5c4c0097-2598-4974-95d7-6d74a74233f9-kube-api-access-rprrm\") pod \"auto-csr-approver-29552744-zqb6v\" (UID: \"5c4c0097-2598-4974-95d7-6d74a74233f9\") " pod="openshift-infra/auto-csr-approver-29552744-zqb6v" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.434685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rprrm\" (UniqueName: \"kubernetes.io/projected/5c4c0097-2598-4974-95d7-6d74a74233f9-kube-api-access-rprrm\") pod \"auto-csr-approver-29552744-zqb6v\" (UID: \"5c4c0097-2598-4974-95d7-6d74a74233f9\") " pod="openshift-infra/auto-csr-approver-29552744-zqb6v" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.465020 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rprrm\" (UniqueName: \"kubernetes.io/projected/5c4c0097-2598-4974-95d7-6d74a74233f9-kube-api-access-rprrm\") pod \"auto-csr-approver-29552744-zqb6v\" (UID: \"5c4c0097-2598-4974-95d7-6d74a74233f9\") " pod="openshift-infra/auto-csr-approver-29552744-zqb6v" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.496505 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552744-zqb6v" Mar 10 17:44:00 crc kubenswrapper[4749]: I0310 17:44:00.943514 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552744-zqb6v"] Mar 10 17:44:00 crc kubenswrapper[4749]: W0310 17:44:00.949598 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c4c0097_2598_4974_95d7_6d74a74233f9.slice/crio-7b760f94337265f9951164c3fcb11685900e1eb8bdb5c95fc551946062ad9092 WatchSource:0}: Error finding container 7b760f94337265f9951164c3fcb11685900e1eb8bdb5c95fc551946062ad9092: Status 404 returned error can't find the container with id 7b760f94337265f9951164c3fcb11685900e1eb8bdb5c95fc551946062ad9092 Mar 10 17:44:01 crc kubenswrapper[4749]: I0310 17:44:01.311629 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552744-zqb6v" event={"ID":"5c4c0097-2598-4974-95d7-6d74a74233f9","Type":"ContainerStarted","Data":"7b760f94337265f9951164c3fcb11685900e1eb8bdb5c95fc551946062ad9092"} Mar 10 17:44:02 crc kubenswrapper[4749]: I0310 17:44:02.320270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552744-zqb6v" event={"ID":"5c4c0097-2598-4974-95d7-6d74a74233f9","Type":"ContainerStarted","Data":"f294eafaac63b2b349baa404ebcd5ff3c50fbaac0eadf8db853f529be367258f"} Mar 10 17:44:02 crc kubenswrapper[4749]: I0310 17:44:02.343076 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552744-zqb6v" podStartSLOduration=1.349275025 podStartE2EDuration="2.343044946s" podCreationTimestamp="2026-03-10 17:44:00 +0000 UTC" firstStartedPulling="2026-03-10 17:44:00.951448627 +0000 UTC m=+6938.073314314" lastFinishedPulling="2026-03-10 17:44:01.945218548 +0000 UTC m=+6939.067084235" observedRunningTime="2026-03-10 17:44:02.333031845 +0000 UTC m=+6939.454897552" 
watchObservedRunningTime="2026-03-10 17:44:02.343044946 +0000 UTC m=+6939.464910663" Mar 10 17:44:03 crc kubenswrapper[4749]: I0310 17:44:03.329742 4749 generic.go:334] "Generic (PLEG): container finished" podID="5c4c0097-2598-4974-95d7-6d74a74233f9" containerID="f294eafaac63b2b349baa404ebcd5ff3c50fbaac0eadf8db853f529be367258f" exitCode=0 Mar 10 17:44:03 crc kubenswrapper[4749]: I0310 17:44:03.329834 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552744-zqb6v" event={"ID":"5c4c0097-2598-4974-95d7-6d74a74233f9","Type":"ContainerDied","Data":"f294eafaac63b2b349baa404ebcd5ff3c50fbaac0eadf8db853f529be367258f"} Mar 10 17:44:04 crc kubenswrapper[4749]: I0310 17:44:04.627792 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552744-zqb6v" Mar 10 17:44:04 crc kubenswrapper[4749]: I0310 17:44:04.809981 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rprrm\" (UniqueName: \"kubernetes.io/projected/5c4c0097-2598-4974-95d7-6d74a74233f9-kube-api-access-rprrm\") pod \"5c4c0097-2598-4974-95d7-6d74a74233f9\" (UID: \"5c4c0097-2598-4974-95d7-6d74a74233f9\") " Mar 10 17:44:04 crc kubenswrapper[4749]: I0310 17:44:04.815661 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4c0097-2598-4974-95d7-6d74a74233f9-kube-api-access-rprrm" (OuterVolumeSpecName: "kube-api-access-rprrm") pod "5c4c0097-2598-4974-95d7-6d74a74233f9" (UID: "5c4c0097-2598-4974-95d7-6d74a74233f9"). InnerVolumeSpecName "kube-api-access-rprrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:44:04 crc kubenswrapper[4749]: I0310 17:44:04.913147 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rprrm\" (UniqueName: \"kubernetes.io/projected/5c4c0097-2598-4974-95d7-6d74a74233f9-kube-api-access-rprrm\") on node \"crc\" DevicePath \"\"" Mar 10 17:44:05 crc kubenswrapper[4749]: I0310 17:44:05.348015 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552744-zqb6v" event={"ID":"5c4c0097-2598-4974-95d7-6d74a74233f9","Type":"ContainerDied","Data":"7b760f94337265f9951164c3fcb11685900e1eb8bdb5c95fc551946062ad9092"} Mar 10 17:44:05 crc kubenswrapper[4749]: I0310 17:44:05.348076 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b760f94337265f9951164c3fcb11685900e1eb8bdb5c95fc551946062ad9092" Mar 10 17:44:05 crc kubenswrapper[4749]: I0310 17:44:05.348130 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552744-zqb6v" Mar 10 17:44:05 crc kubenswrapper[4749]: I0310 17:44:05.421065 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552738-rzd9h"] Mar 10 17:44:05 crc kubenswrapper[4749]: I0310 17:44:05.426826 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552738-rzd9h"] Mar 10 17:44:05 crc kubenswrapper[4749]: I0310 17:44:05.641937 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3" path="/var/lib/kubelet/pods/f09c6e6d-7ba2-4b5d-8aef-580b1011bbc3/volumes" Mar 10 17:44:06 crc kubenswrapper[4749]: I0310 17:44:06.606913 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:44:06 crc kubenswrapper[4749]: E0310 17:44:06.607194 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:44:20 crc kubenswrapper[4749]: I0310 17:44:20.606622 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:44:20 crc kubenswrapper[4749]: E0310 17:44:20.609046 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:44:29 crc kubenswrapper[4749]: I0310 17:44:29.408031 4749 scope.go:117] "RemoveContainer" containerID="0e2f7e551cb4fd87b6e82feb2ecddabf8d9e9e09363507f3828f01a2d60aa4bf" Mar 10 17:44:29 crc kubenswrapper[4749]: I0310 17:44:29.461194 4749 scope.go:117] "RemoveContainer" containerID="c5b3d4d55948c230771178c219577832502b81089da3d4799fe7992bd41d16a1" Mar 10 17:44:32 crc kubenswrapper[4749]: I0310 17:44:32.607333 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:44:32 crc kubenswrapper[4749]: E0310 17:44:32.607608 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:44:44 crc kubenswrapper[4749]: I0310 17:44:44.608050 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:44:44 crc kubenswrapper[4749]: E0310 17:44:44.609323 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:44:58 crc kubenswrapper[4749]: I0310 17:44:58.607498 4749 scope.go:117] "RemoveContainer" containerID="5e17cee6cdc2b86c14ac0e95ec5938f0e38017d2630dde94a1187d3cd2e0b70a" Mar 10 17:44:58 crc kubenswrapper[4749]: E0310 17:44:58.608230 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7rts_openshift-machine-config-operator(ebcbc0fc-15f3-4e4e-ae14-832adec8da50)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7rts" podUID="ebcbc0fc-15f3-4e4e-ae14-832adec8da50" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.149965 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg"] Mar 10 17:45:00 crc kubenswrapper[4749]: E0310 17:45:00.154578 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4c0097-2598-4974-95d7-6d74a74233f9" containerName="oc" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.154625 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5c4c0097-2598-4974-95d7-6d74a74233f9" containerName="oc" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.154841 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4c0097-2598-4974-95d7-6d74a74233f9" containerName="oc" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.155664 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.157704 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.165637 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.176798 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg"] Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.258957 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f17559da-d3c4-45bf-ae30-9b035f55f7c8-config-volume\") pod \"collect-profiles-29552745-t4xrg\" (UID: \"f17559da-d3c4-45bf-ae30-9b035f55f7c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.259031 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhwxd\" (UniqueName: \"kubernetes.io/projected/f17559da-d3c4-45bf-ae30-9b035f55f7c8-kube-api-access-hhwxd\") pod \"collect-profiles-29552745-t4xrg\" (UID: \"f17559da-d3c4-45bf-ae30-9b035f55f7c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" Mar 10 17:45:00 crc kubenswrapper[4749]: 
I0310 17:45:00.259268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f17559da-d3c4-45bf-ae30-9b035f55f7c8-secret-volume\") pod \"collect-profiles-29552745-t4xrg\" (UID: \"f17559da-d3c4-45bf-ae30-9b035f55f7c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.361243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhwxd\" (UniqueName: \"kubernetes.io/projected/f17559da-d3c4-45bf-ae30-9b035f55f7c8-kube-api-access-hhwxd\") pod \"collect-profiles-29552745-t4xrg\" (UID: \"f17559da-d3c4-45bf-ae30-9b035f55f7c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.361384 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f17559da-d3c4-45bf-ae30-9b035f55f7c8-secret-volume\") pod \"collect-profiles-29552745-t4xrg\" (UID: \"f17559da-d3c4-45bf-ae30-9b035f55f7c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.361444 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f17559da-d3c4-45bf-ae30-9b035f55f7c8-config-volume\") pod \"collect-profiles-29552745-t4xrg\" (UID: \"f17559da-d3c4-45bf-ae30-9b035f55f7c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.362400 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f17559da-d3c4-45bf-ae30-9b035f55f7c8-config-volume\") pod \"collect-profiles-29552745-t4xrg\" (UID: 
\"f17559da-d3c4-45bf-ae30-9b035f55f7c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.369094 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f17559da-d3c4-45bf-ae30-9b035f55f7c8-secret-volume\") pod \"collect-profiles-29552745-t4xrg\" (UID: \"f17559da-d3c4-45bf-ae30-9b035f55f7c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.380608 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhwxd\" (UniqueName: \"kubernetes.io/projected/f17559da-d3c4-45bf-ae30-9b035f55f7c8-kube-api-access-hhwxd\") pod \"collect-profiles-29552745-t4xrg\" (UID: \"f17559da-d3c4-45bf-ae30-9b035f55f7c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.482588 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" Mar 10 17:45:00 crc kubenswrapper[4749]: I0310 17:45:00.920491 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg"] Mar 10 17:45:01 crc kubenswrapper[4749]: I0310 17:45:01.833235 4749 generic.go:334] "Generic (PLEG): container finished" podID="f17559da-d3c4-45bf-ae30-9b035f55f7c8" containerID="4a73b27a991e102733b6c455a5f5026ef826e6b8e1b0908aa4d5bf351c020e9f" exitCode=0 Mar 10 17:45:01 crc kubenswrapper[4749]: I0310 17:45:01.833338 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" event={"ID":"f17559da-d3c4-45bf-ae30-9b035f55f7c8","Type":"ContainerDied","Data":"4a73b27a991e102733b6c455a5f5026ef826e6b8e1b0908aa4d5bf351c020e9f"} Mar 10 17:45:01 crc kubenswrapper[4749]: I0310 17:45:01.833807 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" event={"ID":"f17559da-d3c4-45bf-ae30-9b035f55f7c8","Type":"ContainerStarted","Data":"d2b8d7e805169c5e88a1994ede0e889c4f4cb7353c3e641e49e0426ce27655bf"} Mar 10 17:45:03 crc kubenswrapper[4749]: I0310 17:45:03.190047 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" Mar 10 17:45:03 crc kubenswrapper[4749]: I0310 17:45:03.312017 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f17559da-d3c4-45bf-ae30-9b035f55f7c8-config-volume\") pod \"f17559da-d3c4-45bf-ae30-9b035f55f7c8\" (UID: \"f17559da-d3c4-45bf-ae30-9b035f55f7c8\") " Mar 10 17:45:03 crc kubenswrapper[4749]: I0310 17:45:03.312161 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhwxd\" (UniqueName: \"kubernetes.io/projected/f17559da-d3c4-45bf-ae30-9b035f55f7c8-kube-api-access-hhwxd\") pod \"f17559da-d3c4-45bf-ae30-9b035f55f7c8\" (UID: \"f17559da-d3c4-45bf-ae30-9b035f55f7c8\") " Mar 10 17:45:03 crc kubenswrapper[4749]: I0310 17:45:03.312198 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f17559da-d3c4-45bf-ae30-9b035f55f7c8-secret-volume\") pod \"f17559da-d3c4-45bf-ae30-9b035f55f7c8\" (UID: \"f17559da-d3c4-45bf-ae30-9b035f55f7c8\") " Mar 10 17:45:03 crc kubenswrapper[4749]: I0310 17:45:03.312689 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f17559da-d3c4-45bf-ae30-9b035f55f7c8-config-volume" (OuterVolumeSpecName: "config-volume") pod "f17559da-d3c4-45bf-ae30-9b035f55f7c8" (UID: "f17559da-d3c4-45bf-ae30-9b035f55f7c8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 17:45:03 crc kubenswrapper[4749]: I0310 17:45:03.313008 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f17559da-d3c4-45bf-ae30-9b035f55f7c8-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 17:45:03 crc kubenswrapper[4749]: I0310 17:45:03.318572 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f17559da-d3c4-45bf-ae30-9b035f55f7c8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f17559da-d3c4-45bf-ae30-9b035f55f7c8" (UID: "f17559da-d3c4-45bf-ae30-9b035f55f7c8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 17:45:03 crc kubenswrapper[4749]: I0310 17:45:03.318945 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f17559da-d3c4-45bf-ae30-9b035f55f7c8-kube-api-access-hhwxd" (OuterVolumeSpecName: "kube-api-access-hhwxd") pod "f17559da-d3c4-45bf-ae30-9b035f55f7c8" (UID: "f17559da-d3c4-45bf-ae30-9b035f55f7c8"). InnerVolumeSpecName "kube-api-access-hhwxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 17:45:03 crc kubenswrapper[4749]: I0310 17:45:03.414731 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhwxd\" (UniqueName: \"kubernetes.io/projected/f17559da-d3c4-45bf-ae30-9b035f55f7c8-kube-api-access-hhwxd\") on node \"crc\" DevicePath \"\"" Mar 10 17:45:03 crc kubenswrapper[4749]: I0310 17:45:03.414764 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f17559da-d3c4-45bf-ae30-9b035f55f7c8-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 17:45:03 crc kubenswrapper[4749]: I0310 17:45:03.850645 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" event={"ID":"f17559da-d3c4-45bf-ae30-9b035f55f7c8","Type":"ContainerDied","Data":"d2b8d7e805169c5e88a1994ede0e889c4f4cb7353c3e641e49e0426ce27655bf"} Mar 10 17:45:03 crc kubenswrapper[4749]: I0310 17:45:03.850682 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2b8d7e805169c5e88a1994ede0e889c4f4cb7353c3e641e49e0426ce27655bf" Mar 10 17:45:03 crc kubenswrapper[4749]: I0310 17:45:03.850714 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552745-t4xrg" Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.270800 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj"] Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.277102 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552700-75mwj"] Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.396720 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wxbsm"] Mar 10 17:45:04 crc kubenswrapper[4749]: E0310 17:45:04.397333 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f17559da-d3c4-45bf-ae30-9b035f55f7c8" containerName="collect-profiles" Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.397346 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17559da-d3c4-45bf-ae30-9b035f55f7c8" containerName="collect-profiles" Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.397521 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f17559da-d3c4-45bf-ae30-9b035f55f7c8" containerName="collect-profiles" Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.398765 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wxbsm" Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.424523 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wxbsm"] Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.437543 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/166d4f06-5015-4465-9026-2119ca64f2c2-catalog-content\") pod \"community-operators-wxbsm\" (UID: \"166d4f06-5015-4465-9026-2119ca64f2c2\") " pod="openshift-marketplace/community-operators-wxbsm" Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.437619 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzx5x\" (UniqueName: \"kubernetes.io/projected/166d4f06-5015-4465-9026-2119ca64f2c2-kube-api-access-nzx5x\") pod \"community-operators-wxbsm\" (UID: \"166d4f06-5015-4465-9026-2119ca64f2c2\") " pod="openshift-marketplace/community-operators-wxbsm" Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.437712 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/166d4f06-5015-4465-9026-2119ca64f2c2-utilities\") pod \"community-operators-wxbsm\" (UID: \"166d4f06-5015-4465-9026-2119ca64f2c2\") " pod="openshift-marketplace/community-operators-wxbsm" Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.538816 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzx5x\" (UniqueName: \"kubernetes.io/projected/166d4f06-5015-4465-9026-2119ca64f2c2-kube-api-access-nzx5x\") pod \"community-operators-wxbsm\" (UID: \"166d4f06-5015-4465-9026-2119ca64f2c2\") " pod="openshift-marketplace/community-operators-wxbsm" Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.538973 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/166d4f06-5015-4465-9026-2119ca64f2c2-utilities\") pod \"community-operators-wxbsm\" (UID: \"166d4f06-5015-4465-9026-2119ca64f2c2\") " pod="openshift-marketplace/community-operators-wxbsm" Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.539016 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/166d4f06-5015-4465-9026-2119ca64f2c2-catalog-content\") pod \"community-operators-wxbsm\" (UID: \"166d4f06-5015-4465-9026-2119ca64f2c2\") " pod="openshift-marketplace/community-operators-wxbsm" Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.539683 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/166d4f06-5015-4465-9026-2119ca64f2c2-catalog-content\") pod \"community-operators-wxbsm\" (UID: \"166d4f06-5015-4465-9026-2119ca64f2c2\") " pod="openshift-marketplace/community-operators-wxbsm" Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.540344 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/166d4f06-5015-4465-9026-2119ca64f2c2-utilities\") pod \"community-operators-wxbsm\" (UID: \"166d4f06-5015-4465-9026-2119ca64f2c2\") " pod="openshift-marketplace/community-operators-wxbsm" Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.560238 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzx5x\" (UniqueName: \"kubernetes.io/projected/166d4f06-5015-4465-9026-2119ca64f2c2-kube-api-access-nzx5x\") pod \"community-operators-wxbsm\" (UID: \"166d4f06-5015-4465-9026-2119ca64f2c2\") " pod="openshift-marketplace/community-operators-wxbsm" Mar 10 17:45:04 crc kubenswrapper[4749]: I0310 17:45:04.714394 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wxbsm" Mar 10 17:45:05 crc kubenswrapper[4749]: I0310 17:45:05.212987 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wxbsm"] Mar 10 17:45:05 crc kubenswrapper[4749]: W0310 17:45:05.216893 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod166d4f06_5015_4465_9026_2119ca64f2c2.slice/crio-90b50328962cf2376f2e590dabd5aeadb1e1fb308d10d40b74968b92d6cacb94 WatchSource:0}: Error finding container 90b50328962cf2376f2e590dabd5aeadb1e1fb308d10d40b74968b92d6cacb94: Status 404 returned error can't find the container with id 90b50328962cf2376f2e590dabd5aeadb1e1fb308d10d40b74968b92d6cacb94 Mar 10 17:45:05 crc kubenswrapper[4749]: I0310 17:45:05.615684 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6" path="/var/lib/kubelet/pods/7b5557f9-e0d9-4d97-a1d5-7c1b79e4f5b6/volumes" Mar 10 17:45:05 crc kubenswrapper[4749]: I0310 17:45:05.878797 4749 generic.go:334] "Generic (PLEG): container finished" podID="166d4f06-5015-4465-9026-2119ca64f2c2" containerID="ef9ca3cff9ca0b4aa7243c717e53670441d03a5d2c52b47ea470ba966a4b033d" exitCode=0 Mar 10 17:45:05 crc kubenswrapper[4749]: I0310 17:45:05.879138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxbsm" event={"ID":"166d4f06-5015-4465-9026-2119ca64f2c2","Type":"ContainerDied","Data":"ef9ca3cff9ca0b4aa7243c717e53670441d03a5d2c52b47ea470ba966a4b033d"} Mar 10 17:45:05 crc kubenswrapper[4749]: I0310 17:45:05.879238 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxbsm" event={"ID":"166d4f06-5015-4465-9026-2119ca64f2c2","Type":"ContainerStarted","Data":"90b50328962cf2376f2e590dabd5aeadb1e1fb308d10d40b74968b92d6cacb94"} Mar 10 17:45:05 crc kubenswrapper[4749]: I0310 
17:45:05.881140 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 17:45:07 crc kubenswrapper[4749]: I0310 17:45:07.898655 4749 generic.go:334] "Generic (PLEG): container finished" podID="166d4f06-5015-4465-9026-2119ca64f2c2" containerID="6ddb881bad3e269c0d7893a93adfb9203a52dd47d357701a407cf90c9653a9c3" exitCode=0 Mar 10 17:45:07 crc kubenswrapper[4749]: I0310 17:45:07.898759 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxbsm" event={"ID":"166d4f06-5015-4465-9026-2119ca64f2c2","Type":"ContainerDied","Data":"6ddb881bad3e269c0d7893a93adfb9203a52dd47d357701a407cf90c9653a9c3"} Mar 10 17:45:09 crc kubenswrapper[4749]: I0310 17:45:09.924290 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxbsm" event={"ID":"166d4f06-5015-4465-9026-2119ca64f2c2","Type":"ContainerStarted","Data":"e544ff038e7c24f41a58e0547e6d1160a4728aea0b28fb70bcf68bc050904b58"} Mar 10 17:45:09 crc kubenswrapper[4749]: I0310 17:45:09.951937 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wxbsm" podStartSLOduration=3.234946668 podStartE2EDuration="5.951907829s" podCreationTimestamp="2026-03-10 17:45:04 +0000 UTC" firstStartedPulling="2026-03-10 17:45:05.880883848 +0000 UTC m=+7003.002749535" lastFinishedPulling="2026-03-10 17:45:08.597844999 +0000 UTC m=+7005.719710696" observedRunningTime="2026-03-10 17:45:09.94899119 +0000 UTC m=+7007.070856887" watchObservedRunningTime="2026-03-10 17:45:09.951907829 +0000 UTC m=+7007.073773516"